Project Document
Whitepaper
Public funding whitepaper for AI Companion. This document explains the product vision, the embodiment model, the current development direction, the funding rationale, and the limits of what is being promised at this stage.
Version: Funding Draft v2.0. Date: March 22, 2026.
1. Executive Summary
AI Companion is a concept-stage desktop robot designed to give a physical body to your AI, whether that is Molbot or any other API-connected AI system.
It is not a standalone AI, it is not a closed ecosystem, and it is not a toy built around pre-scripted behavior. The idea is simple: the user connects an AI endpoint and the robot comes alive. The core hardware concept includes:
- Face display
- Expressive arms
- Wheels for indoor movement
- Camera, microphone, and speaker
- Wi-Fi connectivity
- Simple setup flow for an external AI brain
2. Project Status
AI Companion is currently in the concept and funding stage. The product is not yet in mass production and the industrial design is still evolving.
The image used in presentations should be treated as a concept render and a target direction. The first version requires funding for prototype work, engineering, and manufacturing preparation.
Estimated launch window after successful funding and development progress: 7 to 12 months. This remains an estimate only and not a guaranteed delivery promise.
3. The Problem
Most AI systems still live behind screens, speakers, or browser tabs. They can answer questions, generate text, and reason over data, but they do not have real physical presence in a room.
As AI becomes more personal, more persistent, and more autonomous, a new question appears: what does it mean for an AI to have a body? Today, a screen-bound assistant has:
- No physical presence
- No spatial feedback
- No visible emotional expression in a room
- No embodied interaction with a user
4. The Solution
AI Companion is a universal robotic body for AI. Instead of building yet another locked assistant, the goal is to give an external AI a physical body it can use in the real world.
The robot handles embodiment. The connected AI handles intelligence. This makes the product more flexible, more upgradeable, and more future-proof. Through the robot body, that AI can:
- Speak
- See
- Move through indoor space
- React through sensors, including touch
- Express emotion and social presence
5. What the Product Is
The first target product is a compact wheeled desktop robot designed to feel alive, expressive, and present. Its planned feature set includes:
- Front face screen for eyes, expressions, text, and status
- Two expressive arms for gestures
- Differential-drive wheels for movement
- Front-facing camera
- Microphone array for voice capture
- Speaker for speech output
- LEDs for visual feedback
- Planned touch and tactile sensing on the arms and body
- Obstacle and edge awareness for safer motion
- Rechargeable battery operation
- Wi-Fi connectivity
6. What the Product Is Not
AI Companion is not a fully autonomous humanoid robot, not a local AGI box, not a manipulation robot for real physical work, and not a closed assistant tied forever to one cloud vendor.
For v1, the arms are intended for expression and character, not heavy manipulation. The robot is designed for indoor, low-speed movement and interaction.
7. Core Product Thesis
Bring your own AI. Give it a body. That is the product thesis.
The robot should not force users into one model, one app, or one subscription lock-in. Instead, it should let many compatible AI systems become the brain, for example:
- Molbot
- A personal AI agent
- A cloud LLM stack
- A local AI system
- A research agent
- A custom enterprise assistant
8. How It Works
AI Companion is designed around a simple connection model. The robot becomes the physical interface for the chosen AI endpoint.
The robot is the body. The AI is the brain. Data flows in both directions:
- Robot sends: audio input, camera input, sensor state, battery and device state
- AI sends: speech output, movement commands, expression commands, LED commands, and behavior commands
The planned first-run setup flow is listed below, followed by a rough sketch of the message exchange.
- 1. User powers on the robot
- 2. Robot starts a guided setup flow
- 3. User enters Wi-Fi and Brain API details
- 4. Robot connects to the chosen AI endpoint
- 5. The AI receives streams and sends commands
- 6. The robot becomes the physical interface for that AI
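To make the exchange above concrete, here is a minimal, purely illustrative sketch of the two message directions as JSON payloads. The message types, field names, and the assumption of a JSON-over-network channel are placeholders for this draft, not a finalized Brain API specification.

```python
import json

# Illustrative robot -> brain update (all field names are assumptions, not a final API).
robot_update = {
    "type": "robot.update",
    "audio_chunk": "<base64 microphone frame>",           # streamed voice input
    "camera_frame": "<base64 JPEG frame>",                 # camera input
    "sensors": {"touch": False, "edge_detected": False},   # touch and edge awareness
    "battery": {"percent": 82, "charging": False},         # battery and device state
}

# Illustrative brain -> robot command (same caveat applies).
brain_command = {
    "type": "brain.command",
    "speak": "Hello! I can see the room now.",             # speech output
    "move": {"linear_m_s": 0.1, "angular_rad_s": 0.0},     # wheeled movement request
    "expression": "happy",                                  # face animation preset
    "leds": {"ring": "pulse_blue"},                         # LED feedback
}

# Either side would serialize a message like this before sending it over the link.
print(json.dumps(robot_update, indent=2))
print(json.dumps(brain_command, indent=2))
```

In practice, the transport, framing, and authentication would be defined by the documented control interface described in the next section.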
9. Molbot Integration and Universal Brain Mode
The platform is being designed with two primary modes. In Molbot Integration mode, the robot can connect directly to Molbot and become its physical embodiment.
In Brain API mode, the robot connects to any compatible external AI endpoint through a documented control interface. This keeps the product useful even as models, providers, and software ecosystems change over time.
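As an illustration only, the sketch below shows what a saved brain configuration could look like for the two modes. The mode names, fields, and example endpoints are assumptions made for this draft, not committed interface details.

```python
from dataclasses import dataclass

# Hypothetical brain configuration captured during setup.
# Field and mode names are illustrative only; nothing here is finalized.
@dataclass
class BrainConfig:
    mode: str          # assumed mode names: "molbot" or "brain_api"
    endpoint_url: str  # where the robot streams sensor data and receives commands
    api_key: str       # credential entered by the user during setup

# Molbot Integration mode: the robot talks directly to a Molbot endpoint.
molbot_brain = BrainConfig(
    mode="molbot",
    endpoint_url="https://molbot.example/v1/robot",      # placeholder address
    api_key="user-supplied-key",
)

# Brain API mode: any compatible endpoint, cloud or local, can be the brain.
local_brain = BrainConfig(
    mode="brain_api",
    endpoint_url="http://192.168.1.50:8080/companion",   # e.g. a local AI system
    api_key="user-supplied-key",
)

print(molbot_brain)
print(local_brain)
```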
10. Key Capabilities Planned for V1
These features represent the minimum required for the robot to feel like a real AI companion body rather than just a moving speaker.
- Expressive animated face on screen
- Voice conversation
- Camera-based perception
- Wheeled movement
- Expressive arm gestures
- LED status ring and indicators
- Touch and tactile awareness
- Obstacle awareness
- Edge and fall awareness
- Remote or local-network brain connection
- Plug-and-play setup flow
11. Preliminary Product Specifications
The following specifications are preliminary targets for v1 and may evolve during engineering. They are design goals for the first batch, not final locked specifications.
- Height target: approximately 22 to 28 cm
- Width target: approximately 16 to 20 cm
- Weight target: approximately 1.1 to 1.8 kg
- Indoor differential-drive wheel base
- Expressive screen face
- Two gesture arms
- Front-facing camera
- Microphone array
- Integrated speaker
- Wi-Fi connectivity
- Rechargeable battery operation
- Expected runtime target: approximately 1.5 to 3 hours of mixed active use
12. Why External AI Instead of Built-In AI
The product is intentionally designed around an external brain. That choice matters because it allows:
- Continuous AI upgrades without replacing the robot
- Support for Molbot and third-party systems
- Local or cloud deployment options
- Lower hardware complexity inside the robot
- More flexibility for developers and owners
13. User Experience
The ideal experience is intentionally simple: you unbox the robot, turn it on, connect it to your AI, and it comes alive.
The goal is not a developer-only setup path full of firmware flashing and wiring diagrams. The aim is an approachable setup flow for hobbyists, developers, early adopters, researchers, and users who already have an AI and want a body for it.
14. Why Funding Is Needed
This project is hardware, and hardware needs capital before it becomes real. The goal of the funding phase is not to pretend the product is already finished.
The goal is to turn a clear concept into a manufacturable first batch.
- Prototype engineering
- Electronics integration
- Mechanical design
- Enclosure and manufacturing preparation
- Firmware and setup software
- Safety validation
- Pilot production
- First-batch assembly and testing
15. How the Funding Would Be Used
Funds would be used across the areas required to move from concept to real hardware. This is not a marketing-only campaign. It is a build campaign for a real hardware product.
- Prototype electronics and robotics hardware
- Mechanical design and enclosure development
- Display, audio, and motion integration
- Firmware and setup software
- Safety and reliability testing
- Pilot manufacturing preparation
- Packaging and first-batch assembly
- Spare parts, failures, and contingency
16. Estimated Timeline After Successful Funding
The current target timeline for the earliest first-batch readiness and launch is only an estimate. Hardware development always carries execution risk.
- 1. Month 0 to 1: funding close, final product scope freeze, mechanical and electronics planning
- 2. Month 1 to 3: engineering prototypes, motion and expression testing, first working embodied AI demos
- 3. Month 3 to 5: electronics refinement, enclosure iteration, software integration
- 4. Month 5 to 7: design validation, pilot manufacturing preparation, packaging and assembly workflow
- 5. Month 7 to 12: first batch production, testing, earliest launch window, and limited first-batch availability
17. What Supporters Should Expect
Supporters should expect a concept-stage hardware campaign, regular development updates, transparent reporting on progress and delays, and possible early access only after engineering and validation are complete.
Supporters should not expect instant delivery, a retail-polished product in the first week, a guaranteed unit, or a guarantee that the exact concept render is the final industrial design.
The honest expectation is participation in the creation of a first-generation product and support for its development journey.
18. Risks and Honest Constraints
We want to be explicit about the reality of the project and the uncertainty that comes with building hardware. Key risk areas include:
- Hardware complexity
- Power and battery optimization
- Manufacturing iteration
- Supply chain changes
- Firmware stability
- Production timelines
- Shipping delays
19. Why This Matters
AI will not remain only a text box forever. There will be a category of products that turns software intelligence into embodied presence.
Some products will be closed and vertically integrated. Our belief is that there should also be an open, brain-agnostic path: connect your AI and bring it to life.
AI Companion is intended to be that path.
20. Who It Is For and Who It Is Not For
AI Companion is being built for Molbot users, AI enthusiasts, developers, embodied AI experimenters, robotics hobbyists, research labs, creative technologists, and early hardware supporters.
It is probably not for people looking for a cheap toy robot, buyers who need a fully finished retail product immediately, or users expecting heavy physical manipulation or advanced autonomous robotics in v1.
The first version is about embodied presence, not full humanoid utility.
21. Supporter Framing
The most honest way to understand this campaign is that you are supporting the development of a real embodied AI hardware platform.
If the product reaches launch, supporters may receive launch notice, early access opportunities, and future discount eligibility. This is not a same-week preorder. It is a funding-stage support pledge for a hardware project in development.
22. Long-Term Vision
The first milestone is deliberately simple: build the first lovable, useful, AI-agnostic body for Molbot, personal AI agents, and similar systems. The long-term vision extends beyond that single robot and could grow to include:
- Docking and charging base
- Accessories and modular shells
- Better autonomy and navigation
- Multiple AI profiles
- Local and cloud brain switching
- Developer SDKs and behavior plugins (see the sketch after this list)
- Profile marketplace
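For the SDK and plugin item above, here is a deliberately rough sketch of how a behavior plugin could eventually hook into the robot runtime. Every name in it is hypothetical; it only illustrates the kind of extension point the long-term vision refers to.

```python
from abc import ABC, abstractmethod

# Hypothetical plugin interface; not a committed SDK design.
class BehaviorPlugin(ABC):
    @abstractmethod
    def on_event(self, event: dict) -> list[dict]:
        """Receive a robot event and return zero or more command messages."""

class GreetOnTouch(BehaviorPlugin):
    """Example behavior: react with a greeting when the touch sensor fires."""
    def on_event(self, event: dict) -> list[dict]:
        if event.get("type") == "touch":
            return [{"type": "brain.command", "speak": "Hello!", "expression": "happy"}]
        return []

# Usage sketch: the robot runtime would fan events out to registered plugins.
plugins = [GreetOnTouch()]
commands = [cmd for plugin in plugins for cmd in plugin.on_event({"type": "touch"})]
print(commands)
```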
23. Closing Statement
AI Companion is not being built as a gimmick. It is being built around one product belief: AI deserves a body.
If funding succeeds, the next step is to turn that belief into a working first batch.
Connect your AI. Bring it to life.
24. Whitepaper Update Notice
This whitepaper reflects the current direction of the project at the time of publication.
The project creator reserves the right to modify, update, replace, or expand this whitepaper, the product specifications, the visuals, the timeline, the support model, and any other published project materials as development continues.