
OPTINVENT

Country: France
7 Projects
  • Funder: EC, Project Code: 287595
    Partners: University of Cologne, CEA, YUO, MICROOLED SARL, NOVERO GMBH, OPTINVENT, MERCK KOMMANDITGESELLSCHAFT AUF AKTIEN
  • Funder: EC, Project Code: 233605
    Partners: THALES AVIONICS SAS, DIEHL AEROSPACE GMBH, ALITALIA S.p.A., DTU, IMEC, OPTINVENT, Technological Educational Institute of Piraeus, ALENIA AERMACCHI SPA, University of Malta
  • Open Access mandate for Publications and Research data
    Funder: EC, Project Code: 732515
    Overall Budget: 3,990,210 EUR
    Funder Contribution: 3,990,210 EUR
    Partners: ISTITUTO EUROPEO DI ONCOLOGIA SRL, OPTINVENT, C-Tech Innovation (United Kingdom), NBT, HIT HYPERTECH INNOVATIONS LTD, CYBERNETIX, CERTH, University of Bristol, IDIOTIKO POLIIATRIO ORTHOPAIDIKIS CHIROURGIKIS ATHLITIKON KAKOSEON KAI APOKATASTASIS ETAIRIA PERIORISMENI EFTHINIS, POLITECNICO DI MILANO...

    Robot-assisted minimally invasive surgery (RAMIS) offers many advantages over traditional MIS, including improved vision, precision and dexterity. While the popularity of RAMIS is steadily increasing, its potential for improving patient outcomes and expanding into many more procedures is not fully realised, largely because of serious limitations in current instrumentation, control and feedback to the surgeon. Specifically, restricted access, lack of force feedback, and the use of rigid tools in confined spaces filled with organs pose challenges to full adoption. We aim to develop novel technology to overcome the barriers to expanding RAMIS to more procedures, focusing on real-world surgical scenarios in urology, vascular surgery, and soft-tissue orthopaedic surgery. A team of highly experienced clinical, academic, and industrial partners will collaborate to develop: i) dexterous anthropomorphic instruments demanding minimal cognitive effort; ii) a range of bespoke end-effectors with embedded surgical tools, using additive manufacturing methods for rapid prototyping and testing within a user-centred approach; iii) a wearable multi-sensory master for tele-operation to optimise perception and action; and iv) wearable smart glasses for augmented-reality guidance of the surgeon based on real-time 3D reconstruction of the surgical field, using dynamic active constraints to restrict the instruments to safe regions. The demonstration platform will be based on commercial robotic manipulators enhanced with the SMARTsurg advanced hardware and software features. Testing will be performed with surgeons on laboratory phantoms to bring the technology closer to exploitation and to validate acceptance by clinicians. The study will benefit patients, surgeons and health providers by promoting safety and ergonomics as well as reducing costs. Furthermore, there is potential to improve complex remote-handling procedures in domains beyond RAMIS.

  • Open Access mandate for Publications
    Funder: EC, Project Code: 731974
    Overall Budget: 4,438,440 EUR
    Funder Contribution: 3,816,440 EUR
    Partners: CEA, MECTRON SPA, OKEY, TUM, SCOPIS GMBH, UNIBO, UniPi, OPTINVENT, Charité - University Medicine Berlin, SSSA...

    The idea of augmenting the surgeon’s perception with new augmented reality (AR) visualization modalities has been a dominant topic of academic and industrial research in the medical domain since the 1990s. AR technology represents a significant development in the context of image-guided surgery (IGS). The quality of the AR experience affects its degree of acceptance among physicians, and it depends on how well the virtual content is integrated into the real world spatially, photometrically and temporally. In this regard, wearable systems based on head-mounted displays (HMDs) offer the most ergonomic and easily translatable solution for many surgeries. Most AR HMDs fall into two categories according to the see-through paradigm they implement: video see-through (VST) and optical see-through (OST). In OST systems, the user’s direct view of the real world is augmented by projecting virtual information into the user’s line of sight. In VST systems, by contrast, the virtual content is merged with images captured by two external cameras anchored to the visor. With respect to technological and human-factors issues, both approaches have their own strengths and shortcomings. In this project, we identify a hybrid OST/VST HMD as a disruptive solution for improving surgical outcomes. The application-driven device will be developed from existing systems, exploiting the know-how acquired within the consortium on photonics KET technologies. The resulting device will undergo three clinical trials whose results will be fundamental for direct industrial exploitation, including an economic viability analysis. The Video-Optical See-Through AR surgical System (VOSTARS) will be the first hybrid see-through HMD surgical navigator. Furthermore, although VOSTARS will be designed specifically for medical procedures, its design is intended to evolve into a multi-purpose AR platform for HMDs.

  • Open Access mandate for Publications and Research data
    Funder: EC, Project Code: 101016499
    Overall Budget: 6,461,390 EUR
    Funder Contribution: 6,461,390 EUR
    Partners: Chemnitz University of Technology, CEA, ATOS IT, OPTINVENT, TTI, IMEC, VIZLORE LABS FOUNDATION, NOKIA NETWORKS FRANCE, DIAKINISIS S.A., WINGS ICT...

    In future 6G wireless networks, it is imperative to support more dynamic resourcing and connectivity to improve adaptability, performance, and trustworthiness in the presence of emerging human-centric services with heterogeneous computation needs. DEDICAT 6G aims to develop a smart connectivity platform using artificial intelligence and blockchain techniques that will enable 6G networks to combine the existing communication infrastructure with a novel distribution of intelligence (data, computation and storage) at the edge, allowing not only flexible but also energy-efficient realisation of the envisaged real-time experience. DEDICAT 6G takes the next vital step beyond 5G by addressing techniques for achieving and maintaining efficient dynamic connectivity and intelligent placement of computation in the mobile network. In addition, the project targets the design and development of mechanisms for dynamic coverage extension through the exploitation of novel terminals and mobile client nodes, e.g., smart connected cars, robots and drones. DEDICAT 6G also addresses security, privacy and trust assurance, especially for mobile edge services, and enablers for novel interaction between humans and digital systems. The aim is to achieve (i) more efficient use of resources; (ii) reduction of latency, response time, and energy consumption; (iii) reduction of operational and capital expenditures; and (iv) reinforcement of security, privacy and trust. DEDICAT 6G will focus on four use cases: Smart Warehousing, Enhanced Experiences, Public Safety and Smart Highway. The use cases will pilot the developed solutions via simulations, demonstrations in laboratory environments, and larger field evaluations exploiting various assets and testing facilities. The results are expected to show significant improvements in intelligent network load balancing and resource allocation, extended connectivity, enhanced security, privacy and trust, and human-machine interaction.