Let's kill the iPhone: Breakthrough technologies that will serve as the foundation for the next generation of computing
The User Understanding system processes user data through a multi-stage pipeline that transforms raw information into structured knowledge. When data streams are ingested, they undergo a backfill process that extracts insights across various dimensions using an LLM. These insights are stored as structured observations in databases, creating a rich knowledge foundation.
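To make the backfill stage concrete, here is a minimal sketch in Python. It assumes raw records arrive as plain text, that an llm_complete callable wraps whatever model is used, and that observations are persisted to SQLite; the extraction prompt, schema, and function names are all illustrative, not the actual implementation.

```python
# Hypothetical sketch of the backfill stage: each raw user record is passed to
# an LLM with an extraction prompt, and the returned observations are stored as
# structured rows. Names and schema are illustrative only.
import json
import sqlite3
from typing import Callable

EXTRACTION_PROMPT = """Extract structured observations from the user data below.
Return a JSON list of objects with keys: dimension, observation, confidence.

User data:
{record}
"""

def backfill(records: list[str],
             llm_complete: Callable[[str], str],
             db_path: str = "observations.db") -> None:
    """Run the LLM over each raw record and persist structured observations."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS observations ("
        "  source_record TEXT, dimension TEXT, observation TEXT, confidence REAL)"
    )
    for record in records:
        raw = llm_complete(EXTRACTION_PROMPT.format(record=record))
        for obs in json.loads(raw):  # expects the JSON list described in the prompt
            conn.execute(
                "INSERT INTO observations VALUES (?, ?, ?, ?)",
                (record, obs["dimension"], obs["observation"], obs["confidence"]),
            )
    conn.commit()
    conn.close()
```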
At query time, the system runs a RAG pipeline that combines BM25 lexical search, embedding-based semantic similarity, and temporal context expansion. What makes the architecture powerful is its self-questioning mechanism, which proactively generates and answers questions from multiple perspectives, building a network of pre-computed insights. When a user query arrives, retrieval can therefore surface both raw evidence and previously synthesized observations, allowing for richer responses.
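The retrieval side can be sketched similarly. The example below assumes a rank_bm25 index for lexical scoring, a placeholder embed() callable for semantic similarity, and timestamped observations; the score fusion weights are illustrative only. The self-questioning step is omitted here; it would run the same LLM offline to generate and answer questions, storing the answers as additional observations in this same index.

```python
# Minimal sketch of hybrid retrieval: fuse BM25 and embedding scores, then
# expand the top hits with temporally adjacent observations. All parameters
# and weights are placeholders, not the real system.
import math
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Callable

from rank_bm25 import BM25Okapi  # pip install rank-bm25

@dataclass
class Observation:
    text: str
    timestamp: datetime
    embedding: list[float]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query: str,
             observations: list[Observation],
             embed: Callable[[str], list[float]],
             top_k: int = 5,
             window: timedelta = timedelta(hours=6)) -> list[Observation]:
    """Fuse lexical and semantic scores, then add temporal context."""
    bm25 = BM25Okapi([o.text.lower().split() for o in observations])
    lexical = bm25.get_scores(query.lower().split())
    q_emb = embed(query)
    semantic = [cosine(q_emb, o.embedding) for o in observations]

    # Naive weighted fusion; a real system would normalize the two score
    # scales before combining them.
    fused = [0.5 * l + 0.5 * s for l, s in zip(lexical, semantic)]
    top = sorted(range(len(observations)), key=lambda i: fused[i], reverse=True)[:top_k]

    # Temporal context expansion: also pull observations close in time to each hit.
    hits = set(top)
    for i in list(hits):
        for j, o in enumerate(observations):
            if abs(o.timestamp - observations[i].timestamp) <= window:
                hits.add(j)
    return [observations[i] for i in sorted(hits)]
```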
Wireframes and Interaction Mock-ups
Allow third-party developers to ask your Personal AI a question (see the API sketch after this list)
Strong privacy controls built into iOS
Custom image classifier for enhanced FindMy network (working prototype)
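As a rough idea of the developer-facing surface hinted at in these mock-ups, the sketch below shows a third-party app submitting a question to the Personal AI, gated by a per-app permission check in the spirit of the privacy controls above; every class, method, and scope name is invented for illustration and is not an actual iOS or backend API.

```python
# Hypothetical third-party query surface: an app asks the Personal AI a
# question, but only for data scopes the user has explicitly granted it.
from dataclasses import dataclass
from typing import Callable

@dataclass
class DeveloperQuery:
    app_id: str
    question: str

class PersonalAI:
    def __init__(self,
                 answer_fn: Callable[[str], str],
                 granted_permissions: dict[str, set[str]]):
        self._answer_fn = answer_fn              # e.g. the RAG pipeline sketched above
        self._permissions = granted_permissions  # app_id -> allowed data scopes

    def ask(self, query: DeveloperQuery, scope: str) -> str:
        # Privacy control: refuse to answer for scopes the user has not granted.
        if scope not in self._permissions.get(query.app_id, set()):
            raise PermissionError(f"{query.app_id} lacks access to scope '{scope}'")
        return self._answer_fn(query.question)
```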