Fari is designed to deliver clinical-grade health monitoring with the warmth of a trusted friend — combining advanced AI with an unwavering commitment to dignity, privacy, and human authority.
Fari is being designed to address the three dimensions of eldercare that matter most — clinical health monitoring, emotional wellbeing, and physical safety.
Designed for continuous, non-invasive vital sign tracking — heart rate, SpO₂, respiratory rate, skin temperature, and estimated blood pressure. AI models are being developed to detect subtle health pattern changes before they become emergencies.
Our Affective Multi-modal Depression Classifier (AMDC) is designed to analyse facial micro-expressions, vocal prosody, and behavioural patterns to assess emotional wellbeing — enabling proactive companionship and caregiver alerts.
Fall detection and prevention using the Human Trajectory Detection system (HTD-IRL). Designed to predict fall risk from gait analysis and environmental scanning, with target alert latency of under 3.2 seconds from detection to caregiver notification.
A selection of Fari's planned capabilities — each governed by our SEOM ethical framework to ensure dignity, privacy, and human authority at every interaction.
Designed to deliver personalised medication reminders with visual and audio cues, confirm intake via computer vision, and alert caregivers to missed doses. SEOM Rule F02 governs all medication interactions.
Natural language interaction designed to reduce loneliness. Fari's Socially-aware Trajectory Understanding Model (STUM) is designed to calibrate conversation style based on resident preferences and emotional state.
Using mmWave radar and depth cameras, Fari is designed to monitor gait patterns and predict fall risk. The HTD-IRL model uses inverse reinforcement learning to identify pre-fall biomechanical signatures.
WBT1 mobility platform designed for smooth, safe navigation in care environments. Multi-sensor SLAM with LiDAR, depth cameras, and ultrasonic arrays enables obstacle avoidance and room-to-room autonomy.
Our Vimeo showcase brings Fari to life — the team, the technology, and the vision behind InGen Dynamics' eldercare AI companion. Features shown are in active development and subject to change.
We are actively engaging with care providers, health systems, and research partners. Let's explore how Fari could support your residents and staff.
Every feature Fari is being designed to deliver — each governed by the SEOM ethical framework. All capabilities represent design targets and are subject to change during development.
Designed to continuously monitor heart rate, SpO₂, respiratory rate, and skin temperature, and to estimate blood pressure, via non-invasive sensors.
Non-contact sleep monitoring designed to track sleep stages, breathing patterns, and nocturnal movement using mmWave radar and ambient sensors.
AI designed to establish individual baseline health profiles during a 14-day onboarding period, enabling detection of subtle deviations from personal norms.
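One plausible way to flag "subtle deviations from personal norms" is a simple z-score against the resident's own onboarding statistics. The sketch below is illustrative only — the sample readings, 3-sigma threshold, and function names are assumptions, not Fari's actual model.

```python
from statistics import mean, stdev

def build_baseline(samples: list[float]) -> tuple[float, float]:
    """Summarise an onboarding window of readings as (mean, std)."""
    return mean(samples), stdev(samples)

def deviation_score(value: float, baseline: tuple[float, float]) -> float:
    """Z-score of a new reading against the resident's personal baseline."""
    mu, sigma = baseline
    return (value - mu) / sigma if sigma > 0 else 0.0

# Illustrative resting heart-rate readings (bpm) from a 14-day onboarding.
onboarding_hr = [62, 64, 61, 63, 65, 62, 60, 64, 63, 62, 61, 65, 63, 62]
baseline = build_baseline(onboarding_hr)

# A later reading is flagged if it deviates more than 3 sigma from baseline.
z = deviation_score(78, baseline)
alert = abs(z) > 3.0
```

In practice a production system would use rolling windows, time-of-day context, and multivariate models rather than a single static z-score, but the principle — compare against the individual, not the population — is the same.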
Designed to track room temperature, humidity, air quality, and light levels to identify environmental factors affecting resident health and comfort.
Designed to synchronise observations with Electronic Health Records via HL7 FHIR R4 APIs, enabling seamless clinical data flow with target sync latency under 5 minutes.
Personalised audio and visual medication reminders designed with escalation protocols — gentle prompt → detailed instructions → caregiver alert. Governed by SEOM Rule F02.
Computer vision designed to visually confirm medication intake and detect potential missed doses, logging all events for clinical review.
AMDC model designed to analyse facial micro-expressions, vocal prosody, and behavioural patterns to assess emotional state with target accuracy of 87.4%.
Natural language interaction designed to reduce feelings of loneliness. The STUM model is designed to calibrate conversation style, topic selection, and emotional tone to each resident.
Designed to suggest and facilitate cognitive exercises, music therapy, reminiscence activities, and gentle physical exercises tailored to individual capability and preference.
Video calling with auto-framing (target 1080p), wellbeing digest sharing, and activity photo sharing — all governed by SEOM F08 and F10 consent rules.
Multi-sensor fall detection designed with target response time of under 3.2 seconds from detection to caregiver notification. Uses depth cameras and mmWave radar.
HTD-IRL model designed to analyse gait patterns using inverse reinforcement learning to identify pre-fall biomechanical signatures and alert staff proactively.
SEOM Rule F03 mandates that emergency detection always takes priority over all other operations. Designed with multi-channel alerting: on-device, dashboard, mobile push, and SMS.
Designed for residents with cognitive impairment — monitors movement patterns and triggers alerts when unusual wandering behaviour is detected, especially during night hours.
WBT1 mobility platform with multi-sensor SLAM using LiDAR, depth cameras, and ultrasonic arrays. Designed for smooth, safe movement through complex care environments.
Designed for scheduled and on-demand room visits, enabling Fari to check on residents, deliver medication reminders, and provide companionship across an entire care facility.
Real-time obstacle detection and path planning built on safety-first principles — Fari is designed to always yield to humans and to stop immediately on contact detection.
Fari's technology stack combines edge AI processing, a 14-sensor suite, and the SEOM ethical framework — all designed to run locally on NVIDIA Jetson Orin NX for privacy-first operation.
Each sensor is selected for its contribution to resident safety and wellbeing — no data is collected without clinical or care justification.
Foundation reasoning model designed for clinical decision support. Uses relative policy comparison to generate contextual health assessments without making diagnoses (SEOM F09).
Real-time ethical governance layer. 12 immutable rules with severity tiers (CRITICAL/HIGH/STANDARD). λ=10.0 penalty weight ensures ethical compliance overrides all other objectives.
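The effect of the λ=10.0 penalty weight can be sketched as a penalised action score: even a small ethical violation outweighs a large task gain. This is a simplified illustration, assuming `violation_cost` is a non-negative measure of SEOM rule infringement for a candidate action — the function and variable names are ours, not the framework's.

```python
LAMBDA = 10.0  # SEOM penalty weight

def action_score(task_reward: float, violation_cost: float) -> float:
    """Penalised objective: ethical cost is weighted 10x against task reward."""
    return task_reward - LAMBDA * violation_cost

# A faster route that brushes past a resident (small violation) still loses
# to a slower, fully compliant route.
fast = action_score(task_reward=0.9, violation_cost=0.2)  # 0.9 - 2.0 = -1.1
slow = action_score(task_reward=0.5, violation_cost=0.0)  # 0.5
chosen = max([("fast", fast), ("slow", slow)], key=lambda kv: kv[1])[0]
```

Note that for CRITICAL-tier rules a soft penalty would presumably be insufficient; those would act as hard constraints that block an action outright, regardless of score.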
Designed to model and predict social dynamics — calibrating Fari's conversation style, physical proximity, and interaction timing based on individual resident preferences.
Designed to assess emotional wellbeing through facial expression, vocal prosody, and activity pattern analysis. Target accuracy: 87.4%. Reports observations, never diagnoses (SEOM F09).
Uses inverse reinforcement learning to model human movement patterns and predict fall risk from gait analysis. Designed to detect pre-fall biomechanical signatures for preventive alerts.
Designed for multi-unit deployment coordination — enabling multiple Fari units to share observations, coordinate patrols, and balance workload across a care facility.
The Supervised Ethical Override Module governs every AI decision Fari makes. These rules are immutable — they cannot be overridden by any other system objective, even to improve performance.
Fari's engineering process follows a formal V-Model systems engineering lifecycle with SEOM ethical gates at every phase. This page documents our requirements, subsystem architecture, data flow, and development tracks.
Each development phase is gated by SEOM compliance review. Because the λ=10.0 penalty weight is non-negotiable at every gate, ethical considerations are never traded against performance.
| ID | Requirement | Category | Priority | Verification |
|---|---|---|---|---|
| SYS-001 | Continuous non-invasive vital sign monitoring (HR, SpO₂, RR, temp, BP est.) | Health | MUST | Clinical trial |
| SYS-002 | Fall detection with target alert latency <3.2 seconds | Safety | MUST | Lab + field test |
| SYS-003 | SEOM ethical override at λ=10.0 — immutable, non-bypassable | Ethics | MUST | Code audit |
| SYS-004 | Edge AI processing on NVIDIA Jetson Orin NX — no raw biometric cloud transmission | Privacy | MUST | Architecture review |
| SYS-005 | Medication reminder system with 3-tier escalation (SEOM F02) | Safety | MUST | Functional test |
| SYS-006 | Autonomous navigation in care environments with obstacle avoidance | Mobility | MUST | Field test |
| SYS-007 | Multi-modal emotion recognition with target accuracy ≥87.4% | Emotional AI | MUST | Validation study |
| SYS-008 | EHR integration via HL7 FHIR R4 with target sync <5 minutes | Integration | MUST | Integration test |
| SYS-009 | GDPR Article 9 consent management — granular, revocable | Privacy | MUST | Legal review |
| SYS-010 | Target battery runtime ≥12 hours per charge cycle | Hardware | MUST | Endurance test |
| SYS-011 | Natural language conversation with context-aware personalisation | Emotional AI | SHOULD | User study |
| SYS-012 | Sleep quality monitoring — non-contact, mmWave radar-based | Health | SHOULD | Clinical validation |
| SYS-013 | Wandering detection for memory care residents | Safety | SHOULD | Field test |
| SYS-014 | Video calling with auto-framing (target 1080p) | Family | SHOULD | Functional test |
| SYS-015 | Multi-robot coordination for facility-wide deployment (CRL-MRS) | Platform | SHOULD | Simulation + field |
| SYS-016 | Gait analysis and fall prevention via HTD-IRL model | Safety | SHOULD | Clinical study |
| SYS-017 | Environmental monitoring — air quality, humidity, temperature, light | Health | COULD | Sensor validation |
| SYS-018 | Activity engagement — cognitive exercises, music therapy, reminiscence | Emotional AI | COULD | User study |
| SYS-019 | CQC/CMS reporting template generation | Compliance | COULD | Regulatory review |
| SYS-020 | EU AI Act compliance documentation and audit trail | Compliance | COULD | Legal audit |
Medical-grade ABS shell housing all subsystems. Target IPX4, ~120cm height, ~25kg. Designed for care environments with rounded edges, no pinch points, and antimicrobial surfaces.
8" AMOLED touchscreen (target 1920×1200), RGB LED ring for ambient status, 4-element MEMS microphone array, and dual speakers for voice interaction.
Complete sensor array: mmWave radar, depth camera, RGB camera, thermal camera, microphone array, environmental sensors, LiDAR, ultrasonics, IMU, ambient light, battery monitor, contact sensors, encoders, PPG.
NVIDIA Jetson Orin NX (100 TOPS), 16GB LPDDR5, 256GB NVMe SSD. Runs all AI inference on-device. Designed for privacy-first edge computing with no raw biometric data transmitted to cloud.
Hosts GRPO, SEOM, STUM, AMDC, HTD-IRL, and CRL-MRS models. Model orchestration, inference pipeline, and SEOM ethical compliance layer. All inference on Jetson edge.
Hardware safety interlocks, emergency stop circuits, contact-detect bumpers, tilt sensors. Designed for immediate motor cutoff on obstacle contact. Software watchdog for AI system health.
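A software watchdog of the kind described above is typically a liveness timer: the AI process "pets" it every cycle, and an expired timer triggers the hardware cutoff path. A minimal sketch, with an illustrative timeout value that is an assumption, not Fari's specification:

```python
import time

class Watchdog:
    """Liveness timer for the AI system; expiry triggers motor cutoff."""

    def __init__(self, timeout_s: float = 2.0):
        self.timeout_s = timeout_s
        self.last_pet = time.monotonic()

    def pet(self) -> None:
        """Called by the AI loop each cycle to prove it is still alive."""
        self.last_pet = time.monotonic()

    def expired(self) -> bool:
        """True when the AI loop has stalled longer than the timeout."""
        return time.monotonic() - self.last_pet > self.timeout_s
```

In a deterministic safety design, the expiry check itself would run on the real-time microcontroller side rather than inside the process being supervised.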
Differential drive with precision motor control, multi-sensor SLAM (LiDAR + depth + ultrasonics), path planning, and autonomous docking. Designed for smooth, silent movement.
HL7 FHIR R4 API client for electronic health record synchronisation. Designed for bi-directional data flow with target sync latency <5 minutes. mTLS encrypted transport.
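For context, a single vital-sign reading travels to the EHR as a FHIR R4 `Observation` resource. The sketch below follows the published FHIR R4 shape for a heart-rate vital sign (LOINC 8867-4, UCUM `/min`); the patient reference and the reading itself are placeholders, not Fari's actual identifiers or data.

```python
import json
from datetime import datetime, timezone

observation = {
    "resourceType": "Observation",
    "status": "final",
    "category": [{"coding": [{
        "system": "http://terminology.hl7.org/CodeSystem/observation-category",
        "code": "vital-signs"}]}],
    "code": {"coding": [{
        "system": "http://loinc.org",
        "code": "8867-4",            # LOINC code for heart rate
        "display": "Heart rate"}]},
    "subject": {"reference": "Patient/example"},  # placeholder reference
    "effectiveDateTime": datetime.now(timezone.utc).isoformat(),
    "valueQuantity": {
        "value": 72,
        "unit": "beats/minute",
        "system": "http://unitsofmeasure.org",
        "code": "/min",              # UCUM unit code
    },
}

# Serialised payload, as it would be POSTed to the EHR's FHIR endpoint
# over the mTLS-encrypted transport.
payload = json.dumps(observation)
```

Only derived observations like this leave the device; per the edge-computing design, raw biometric streams stay on the Jetson.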
Falls, cardiac events, unresponsive resident. Multi-channel alert: on-device alarm, dashboard popup, mobile push, SMS.
Vital trend deviations, missed medications, unusual activity patterns. Dashboard alert with caregiver notification.
Wellbeing updates, activity summaries, environmental conditions. Logged and batched for shift handover reports.
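The three alert classes above imply a severity-to-channel routing table. A minimal sketch, assuming the tier names shown; the channel identifiers are illustrative, not the product's actual API:

```python
ROUTES = {
    "critical": ["on_device_alarm", "dashboard_popup", "mobile_push", "sms"],
    "elevated": ["dashboard_alert", "caregiver_notification"],
    "routine":  ["shift_handover_log"],
}

def route_alert(severity: str) -> list[str]:
    """Map an alert severity to its notification channels.

    Unknown severities escalate to the critical route — in a safety-first
    design, failing loud beats failing silent.
    """
    return ROUTES.get(severity, ROUTES["critical"])
```

This also reflects SEOM Rule F03: the critical path fans out across every channel so emergency detection is never gated on a single delivery mechanism.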
Each track delivers independently but synchronises at gate reviews. SEOM compliance is verified at every gate — no track proceeds without ethical clearance.
Physical platform, sensor integration, thermal management, power system, autonomous docking. Shell design, manufacturing prototyping, safety certification.
Sensor drivers, motor control, power management, safety interlocks, watchdog timers. Real-time OS on STM32 microcontrollers for deterministic response.
GRPO, SEOM, STUM, AMDC, HTD-IRL, CRL-MRS model development. Training pipeline, inference optimisation for Jetson, continuous learning with privacy-preserving federated updates.
Cloud infrastructure (Azure), FHIR R4 integration, dashboard API, real-time WebSocket notifications, data aggregation pipeline, GDPR compliance engine.
Ward overview with real-time vital status, alert management, medication workflow, shift handover auto-report, SEOM audit panel. Accessibility: WCAG 2.1 AA target. Offline capability for mobile.
Progressive Web App (iOS/Android). Wellbeing digest, video calls (target 1080p, auto-framing), activity photos, consent manager. GDPR Article 9 consent matrix. Reassurance-first design per SEOM F08/F10.
Owns Interface Control Document for all 15 inter-subsystem interfaces. IVT test harness, end-to-end latency verification, SEOM full-system audit, EU AI Act compliance documentation, CQC/CMS reporting templates.
Fari's UX design system is built around a single principle: every interface decision must serve the resident, the caregiver, and the family simultaneously — each with radically different cognitive loads, emotional contexts, and time pressures.
Fari operates at the intersection of clinical care and human dignity. Every interface element is filtered through these principles in order of priority.
Every screen element is filtered through SEOM Rule F04. No labels that pathologise, no alerts that embarrass. Language always uses observation framing — "Fari observed elevated heart rate" not "cardiac event."
Colour temperature, animation easing, and language tone are calibrated to feel warm and reassuring, never clinical or alarming. The resident display is designed never to show anything anxiety-inducing.
Information density varies by audience: residents see simple, large-type reassurance; caregivers see data-rich dashboards; families see curated wellbeing digests. Same data, three interfaces.
Privacy is not a settings toggle — it is embedded in the interface architecture. Camera feeds are processed on-device and never stored. The resident display cannot show raw biometric data.
Every AI recommendation is visually differentiated from confirmed information. Confidence scores are displayed alongside all AI outputs. Action buttons always say "Suggest" not "Do."
While the visual language is warm, the underlying data presentation is clinically precise. Vital signs are displayed with proper units, reference ranges, and trend indicators.
Same data, three radically different presentations — each optimised for the cognitive load, emotional context, and time pressure of its audience.
Fari's interfaces are being designed to meet WCAG 2.1 AA standards across all surfaces, with specific accommodations for the common sensory, motor, and cognitive challenges faced by elderly residents.
We welcome enquiries from healthcare institutions, research partners, government bodies, and those interested in following InGen Dynamics' development journey. Please get in touch via the InGen Dynamics website.
All product enquiries are handled through the InGen Dynamics corporate website.