## 🏗️ Tech Stack

- **Frontend (Web UI):** React + Vite + Tailwind CSS
- **Voice Recognition:** Web Speech API
- **Voice Output:** ElevenLabs API
- **AI Chat:** GPT-4o Mini
- **Search:** SearchAPI
- **PC Control:** Electron + Node.js (to access system-level functions)

## 🔑 Environment Variables

- `OPENAI_API_KEY`
- `ELEVENLABS_API_KEY`
- `SEARCHAPI_KEY`

## 📁 File Structure

```
/jarvis-lite
│
├── public/
│
├── src/
│   ├── components/
│   │   ├── JarvisCore.jsx
│   │   ├── VoiceInput.jsx
│   │   ├── VoiceOutput.js
│   │   └── WakeWordListener.js
│   ├── services/
│   │   ├── openaiService.js
│   │   ├── searchService.js
│   │   ├── elevenLabsService.js
│   │   └── commandHandler.js    // PC control logic
│   ├── styles/
│   └── App.jsx
│
├── electron/
│   ├── main.js                  // Electron entry point
│   └── systemCommands.js        // Node.js code to open apps, URLs
│
├── .env
└── package.json
```

## 🎯 Core Features

- **Wake Word Listener:** Listens for "Jarvis" and activates the UI + mic input.
- **AI Voice Chat:** GPT-4o Mini for questions, small talk, and command understanding.
- **Voice-Controlled PC Actions:** "Open YouTube" → launches in the default browser; "Open Notepad" or "Play music" → uses Electron to run local commands.
- **Real-Time Search:** Calls SearchAPI if the user asks something time-sensitive.
- **Voice Output:** Uses ElevenLabs to speak responses naturally.
- **Sci-Fi UI:** Glowing circle face + animated holographic lines.

## 🔌 APIs & Tools

- 🧠 Chat: GPT-4o Mini
- 🔉 Voice Output: ElevenLabs
- 🌐 Search: SearchAPI
- 🗣️ Voice Input: Web Speech API
- 🧠 PC Control: Electron + Node.js shell commands

## 🧠 Key States

- `wakeWordDetected`
- `spokenCommand`
- `aiResponse`
- `systemCommandIssued`
- `isSpeaking`

## 🚀 User Flow

1. The app listens for "Jarvis".
2. On trigger, it records the user's voice.
3. GPT processes the command (e.g., "Open YouTube").
4. If the command is an app/URL: Electron runs `shell.openExternal` or launches a local app.
5. If it is a question (AI or real-world): uses GPT or SearchAPI.
6. Speaks the reply via ElevenLabs.
7. Animations play during the action.

## ✅ Must-Have Features

- Wake word detection
- Voice input + AI reply
- Sci-fi glowing face
- Real-time search
- PC control (apps/URLs)
- Voice reply via ElevenLabs

## 📝 Result Summary

A sci-fi web-based assistant that talks like JARVIS, answers your questions, searches the web, and can control your PC with voice commands using Electron.
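To make the wake-word flow concrete, here is a minimal TypeScript sketch of what `WakeWordListener` could do with the Web Speech API. The function name, the restart-on-end loop, and the substring match are illustrative assumptions, not a fixed part of the spec:

```typescript
// WakeWordListener.ts: a sketch of continuous wake-word detection.
// Chrome exposes the Web Speech API as webkitSpeechRecognition, hence the fallback.
export function startWakeWordListener(onWake: () => void): () => void {
  const SpeechRecognitionImpl =
    (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
  if (!SpeechRecognitionImpl) {
    throw new Error("Web Speech API is not supported in this browser");
  }

  const recognition = new SpeechRecognitionImpl();
  recognition.continuous = true;     // keep listening between phrases
  recognition.interimResults = true; // react before the phrase is finalized
  recognition.lang = "en-US";

  recognition.onresult = (event: any) => {
    for (let i = event.resultIndex; i < event.results.length; i++) {
      const transcript: string = event.results[i][0].transcript.toLowerCase();
      if (transcript.includes("jarvis")) {
        onWake(); // activate the UI and switch to command capture
      }
    }
  };

  // The browser stops recognition on silence; restart to simulate always-on listening.
  recognition.onend = () => recognition.start();
  recognition.start();

  // Return a cleanup function that disables the auto-restart before stopping.
  return () => {
    recognition.onend = null;
    recognition.stop();
  };
}
```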
This project outlines a sophisticated AI assistant, "Jarvis-Lite," designed to operate with voice commands, integrate with AI models, perform web searches, and even control your PC. While the full Electron-based PC control functionality cannot be demonstrated within the browser-based Next.js environment, I've provided the necessary code for the Electron part, along with a web UI that simulates the interaction.
The web UI is built with Next.js (App Router), React, and Tailwind CSS. It uses the Web Speech API for voice input and reaches the AI and search services through Next.js API routes, so your API keys stay on the server.
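As one illustration of that pattern, here is a minimal sketch of a chat proxy route, assuming the path `app/api/chat/route.ts` and the standard OpenAI chat completions endpoint; the system prompt is invented for the example:

```typescript
// app/api/chat/route.ts: server-side proxy so OPENAI_API_KEY never reaches the browser.
import { NextResponse } from "next/server";

export async function POST(req: Request) {
  const { message } = await req.json();

  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [
        { role: "system", content: "You are JARVIS, a concise sci-fi assistant." },
        { role: "user", content: message },
      ],
    }),
  });

  if (!res.ok) {
    return NextResponse.json({ error: "OpenAI request failed" }, { status: 502 });
  }

  const data = await res.json();
  return NextResponse.json({ reply: data.choices[0].message.content });
}
```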
The following files are for the Electron desktop application that would handle PC control. These files are not executable within the Next.js browser environment but are provided for completeness of the "Jarvis-Lite" project.
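For orientation, here is a hedged TypeScript sketch of what `systemCommands` could look like; the command map and the Windows-specific launch strings are assumptions for illustration, not tested mappings:

```typescript
// electron/systemCommands.ts: a sketch of the PC-control layer.
import { shell } from "electron";
import { exec } from "child_process";

// URLs open in the default browser via Electron's shell module.
export function openUrl(url: string): void {
  shell.openExternal(url);
}

// Local apps launch through a shell command; these examples are Windows-specific
// and purely illustrative ("play music" in particular is a hypothetical mapping).
const LOCAL_APPS: Record<string, string> = {
  notepad: "notepad.exe",
  "play music": 'start "" wmplayer.exe',
};

export function runLocalCommand(name: string): void {
  const cmd = LOCAL_APPS[name.toLowerCase()];
  if (!cmd) throw new Error(`Unknown command: ${name}`);
  exec(cmd, (err) => {
    if (err) console.error(`Failed to run "${name}":`, err);
  });
}
```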
## 🛠️ Setup

1. **Install dependencies:** Navigate to the `jarvis-lite` directory and run `npm install` or `yarn install`.
2. **Environment variables:** Create a `.env.local` file in the root of your `jarvis-lite` project and add your API keys:

   ```
   OPENAI_API_KEY="your_openai_api_key_here"
   ELEVENLABS_API_KEY="your_elevenlabs_api_key_here"
   SEARCHAPI_KEY="your_searchapi_key_here"
   ```

   Then replace `<YOUR_VOICE_ID>` in `app/api/elevenlabs/route.ts` with a voice ID from your ElevenLabs account; this project uses `wDsJlOXPqcvIUKdLXjDs` (see the route sketch after these setup steps).
3. **Run the web UI:** Run `npm run dev` or `yarn dev`, then open http://localhost:3000.
4. **Electron application (optional, for PC control):**
To enable PC control, you would need to set up the Electron part. This typically involves:
   - Building the web UI for production (`npm run build`).
   - Pointing Electron's `main.js` at the built `index.html`.
   - Launching Electron (`electron .` from the `electron` directory, or packaging it with a build tool like `electron-builder`).
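Since the setup steps above point at `app/api/elevenlabs/route.ts`, here is a minimal sketch of that route with the project's voice ID filled in. The `eleven_monolingual_v1` model choice and the error handling are assumptions; the endpoint and `xi-api-key` header follow the public ElevenLabs text-to-speech API:

```typescript
// app/api/elevenlabs/route.ts: text-to-speech proxy.
// VOICE_ID is where <YOUR_VOICE_ID> goes; wDsJlOXPqcvIUKdLXjDs for this project.
import { NextResponse } from "next/server";

const VOICE_ID = "wDsJlOXPqcvIUKdLXjDs";

export async function POST(req: Request) {
  const { text } = await req.json();

  const res = await fetch(
    `https://api.elevenlabs.io/v1/text-to-speech/${VOICE_ID}`,
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "xi-api-key": process.env.ELEVENLABS_API_KEY!,
      },
      body: JSON.stringify({
        text,
        model_id: "eleven_monolingual_v1", // assumed model choice
      }),
    }
  );

  if (!res.ok) {
    return NextResponse.json({ error: "ElevenLabs request failed" }, { status: 502 });
  }

  // Stream the MP3 audio straight back to the browser for playback.
  return new Response(res.body, {
    headers: { "Content-Type": "audio/mpeg" },
  });
}
```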