An Air-Gapped Approach to Maximizing Developer Productivity with Pieces Copilot+ and Live Context


Posted on June 17, 2024


Live Context Privacy & Security.

Managing chaotic workflows while maintaining consistent productivity is a real challenge for software developers contending with new languages, ballooning documentation, and more information overload than ever. At Pieces for Developers, we’ve consistently aimed to address these challenges directly, creating tools that empower developers to work smarter and more efficiently. Our latest innovation, Pieces Copilot+ with Live Context, is a major milestone in this journey, delivering a feature that brings more harmony between human and AI workstreams.

Read on to learn more about how we developed this breakthrough feature to be local-first for air-gapped security and lightning-fast speed, and register for our AMA live stream on Tuesday, June 18, 2024, for an even deeper dive under the hood.

The Vision Behind Live Context

From the inception of Pieces for Developers, our goal has been clear: to enhance developer productivity through intelligent, contextual tools. We began by offering a secure place for storing valuable code snippets, progressed to proactively saving and contextualizing them, and then introduced one of the first on-device LLM-powered AI copilots. With Pieces Copilot+, we’re bringing forth Live Context—a feature that enables the world’s first temporally grounded copilot.

Live Context is designed to understand and adapt to your workflow, allowing the Pieces Copilot to provide relevant assistance based on when, where, and how you work, empowering you to remember anything and interact with everything. Available on macOS, Windows, and Linux, this feature captures and processes workflow data on-device, ensuring both performance and privacy.

How Live Context Enhances Your Workflow

1. Real-Time Workflow Assistance:

  • Live Context helps you keep track of your tasks across different tools and sessions. Whether you’re switching between research in a browser, discussions in collaboration tools like Slack, or coding in your IDE, Pieces Copilot+ remembers your activities and provides timely, context-aware assistance.
  • Example: Ask, “What was I working on an hour ago?” and receive a detailed response that helps you pick up right where you left off.

Using Pieces Copilot to determine what you were doing earlier in the day.

2. Simplifying Complex Tasks:

  • With Live Context, you can streamline error resolution and project hand-offs. By capturing relevant workflow data, the copilot can offer precise guidance without needing you to manually input context.
  • Example: When you encounter an error, simply ask Pieces Copilot+, “How can I resolve this issue in the terminal in IntelliJ?” and it will utilize the relevant context to provide a solution.

Asking Pieces Copilot how to resolve an error in the terminal.

3. Enhancing Developer Communication:

  • The Workstream Pattern Engine within Pieces Copilot+ gathers and processes interaction data to help you manage conversations and collaborations more effectively. This includes generating summaries and action items based on your discussions and activities.
  • Example: Use it to generate talking points for your daily standup or summarize key themes from a list of unread conversations.

Using Pieces Copilot to generate talking points for a daily standup.

Privacy and Security at the Core

We understand that privacy and security are paramount concerns for developers, especially when dealing with sensitive information and proprietary code. Pieces Copilot+ with Live Context has been designed with these considerations at the forefront. Here’s a deeper look into how we ensure your data remains secure:

1. On-Device Processing:

All workflow data captured by Live Context is processed and stored locally on your device. This ensures that sensitive information never leaves your machine unless you explicitly choose to share it. Because this processing can operate in a fully air-gapped capacity, there is no network transmission of workflow data to intercept or breach.

2. Intelligent Visual Snapshots:

Instead of continuously recording your screen (which would be intrusive and resource-intensive), PiecesOS detects when a new, distinct visual focus occurs. It captures intelligently timed snapshots of application visuals, not full screenshots. These snapshots are then processed on-device using segmentation and visual reduction algorithms, ensuring only new and relevant information is analyzed.
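
To make the idea concrete (this is an illustrative sketch, not the actual PiecesOS implementation), a change-detection loop can compare a coarse fingerprint of the active window against the last captured one and only keep a snapshot when the difference crosses a threshold. The `grab_frame` and `save_snapshot` callables and the threshold value below are hypothetical placeholders.

```python
import time
from typing import Callable, List, Optional

def fingerprint(gray_pixels: List[int]) -> List[int]:
    """Threshold a tiny grayscale thumbnail against its mean (a simple average-hash).

    Real perceptual hashing is more robust, but the principle is the same:
    compare coarse fingerprints, never store continuous full-screen recordings.
    """
    mean = sum(gray_pixels) / len(gray_pixels)
    return [1 if p >= mean else 0 for p in gray_pixels]

def hamming(a: List[int], b: List[int]) -> int:
    return sum(x != y for x, y in zip(a, b))

def capture_loop(grab_frame: Callable[[], List[int]],
                 save_snapshot: Callable[[List[int]], None],
                 change_threshold: int = 10,
                 poll_seconds: float = 1.0) -> None:
    """Poll the active window and keep a snapshot only when it looks meaningfully different."""
    last: Optional[List[int]] = None
    while True:
        frame = grab_frame()                      # hypothetical: returns an 8x8 grayscale thumbnail
        fp = fingerprint(frame)
        if last is None or hamming(fp, last) >= change_threshold:
            save_snapshot(frame)                  # only new, distinct visual states are retained
            last = fp
        time.sleep(poll_seconds)
```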

3. Temporary Data Handling:

Extracted text from these snapshots is held only temporarily in memory and permanently deleted after processing, typically within 100 milliseconds. Processed data that does not meet a specific relevance threshold is discarded after 12 hours; this short-term memory system keeps your workflow context uncluttered.
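
As a rough mental model (again, not Pieces’ internal code), that short-term memory can be pictured as a list of timestamped entries where anything that never cleared a relevance threshold is dropped once it is older than 12 hours. The threshold value and scoring below are placeholders.

```python
import time
from dataclasses import dataclass, field
from typing import List, Optional

TTL_SECONDS = 12 * 60 * 60          # low-relevance entries expire after 12 hours
RELEVANCE_THRESHOLD = 0.5           # placeholder; the real threshold is internal to PiecesOS

@dataclass
class Entry:
    text: str
    relevance: float
    created_at: float = field(default_factory=time.time)

class ShortTermMemory:
    def __init__(self) -> None:
        self._entries: List[Entry] = []

    def add(self, text: str, relevance: float) -> None:
        self._entries.append(Entry(text, relevance))

    def prune(self, now: Optional[float] = None) -> None:
        """Drop entries that never met the relevance bar once they age past the TTL."""
        now = now if now is not None else time.time()
        self._entries = [
            e for e in self._entries
            if e.relevance >= RELEVANCE_THRESHOLD or (now - e.created_at) < TTL_SECONDS
        ]
```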

4. Data Compression and Redaction:

To manage storage efficiently, PiecesOS compresses relevant data to about 10% of its original size using an on-device transformer model. This process, known as summarization and redaction, also includes best-effort filtering of sensitive information like API keys and PII. This ensures that only the most critical and non-sensitive data is retained for context generation.
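
A best-effort redaction pass can be as simple as regular-expression filters for common secret and PII shapes, run before anything is summarized or persisted. The patterns below are illustrative only and nowhere near exhaustive; they are not the filters PiecesOS actually ships.

```python
import re

# Illustrative patterns only -- a production redactor would use many more, plus entropy checks.
PATTERNS = {
    "aws_access_key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "generic_api_key": re.compile(r"(?i)(api[_-]?key|token)\s*[:=]\s*\S+"),
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
}

def redact(text: str) -> str:
    """Replace likely secrets and PII with placeholders before summarization."""
    for name, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED:{name}]", text)
    return text

print(redact("email me at dev@example.com, api_key=sk-12345"))
# -> email me at [REDACTED:email], [REDACTED:generic_api_key]
```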

5. Secure Local Storage:

Post-processing, the summarized data is embedded into a local vector database, which remains on your device. This data can be queried during retrieval-augmented generation (RAG) sessions, allowing the copilot to provide contextual assistance without ever transmitting data to the cloud.
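
Conceptually, retrieval against that local store boils down to embedding the question, ranking stored summaries by similarity, and handing the top hits to the model as context. The toy store below uses plain cosine similarity and a placeholder `embed` callable standing in for the on-device embedding model.

```python
from math import sqrt
from typing import Callable, List, Tuple

def cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

class LocalVectorStore:
    """Toy in-memory stand-in for the on-device vector database."""

    def __init__(self, embed: Callable[[str], List[float]]) -> None:
        self._embed = embed                       # placeholder for the on-device embedding model
        self._rows: List[Tuple[List[float], str]] = []

    def add(self, summary: str) -> None:
        self._rows.append((self._embed(summary), summary))

    def query(self, question: str, top_k: int = 3) -> List[str]:
        qvec = self._embed(question)
        ranked = sorted(self._rows, key=lambda row: cosine(qvec, row[0]), reverse=True)
        return [text for _, text in ranked[:top_k]]
```

The retrieved summaries are what gets prepended to the copilot prompt during the RAG step, so the raw workflow data itself never has to leave the device.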

6. Optional Cloud Integration:

While we prioritize on-device processing, you have the flexibility to use cloud-based Large Language Models (LLMs) if preferred. Even in this case, only the refined context is sent to the cloud, minimizing exposure. If using our on-device LLM runtimes (like Mistral or Llama 3), the context never leaves your local environment, ensuring maximum privacy.
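
The privacy boundary can be pictured as a simple routing decision: only the already-summarized, redacted context is ever handed to a cloud model, while an on-device runtime receives the same prompt without any network hop. The function names here are placeholders, not Pieces APIs.

```python
from typing import Callable, List

def answer(question: str,
           refined_context: List[str],
           run_local_llm: Callable[[str], str],
           run_cloud_llm: Callable[[str], str],
           use_cloud: bool = False) -> str:
    """Route the prompt to a local or cloud model; raw snapshots never reach this point."""
    prompt = "\n".join(refined_context) + f"\n\nQuestion: {question}"
    if use_cloud:
        # Only the compressed, redacted context travels over the network.
        return run_cloud_llm(prompt)
    # Fully local path: the prompt never leaves the machine.
    return run_local_llm(prompt)
```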

7. Secure Integrations:

For integrations like the VS Code Plugin, Pieces Copilot+ retrieves and processes stack traces and other relevant data via local HTTP/GRPC connections, ensuring that all data exchanges remain secure and within your control.
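
For a sense of what a local-only exchange looks like, the sketch below posts a stack trace to a locally running PiecesOS endpoint using only the Python standard library. The port and route are hypothetical placeholders, so consult the Pieces documentation for the real values; the point is that the request never leaves 127.0.0.1.

```python
import json
import urllib.request

# Hypothetical local endpoint -- the actual PiecesOS port and route may differ.
PIECES_OS_URL = "http://127.0.0.1:39300/context/stack-trace"

def send_stack_trace(trace: str) -> None:
    """Hand a stack trace to the locally running PiecesOS over plain localhost HTTP."""
    payload = json.dumps({"stackTrace": trace}).encode("utf-8")
    request = urllib.request.Request(
        PIECES_OS_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:    # traffic stays on 127.0.0.1
        print(response.status)
```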

Getting Started with Live Context

Enabling Live Context is straightforward:

  1. Enable the Workstream Pattern Engine: Navigate to the Machine Learning section of the Pieces Desktop App settings and activate the engine.
  2. Adjust Permissions: Follow the prompts to update necessary permissions (if required).
  3. Start Using Live Context: Begin your usual work, and then initiate a conversation with the Pieces Copilot, utilizing Live Context for enhanced assistance.

Conclusion

The launch of Pieces Copilot+ with Live Context marks a significant milestone in our mission to boost developer productivity. By leveraging temporal context and on-device processing, we offer a tool that not only helps you remember and manage your tasks but also ensures your data remains secure and private.

We’re excited to see how Live Context transforms your workflow and look forward to your feedback. Let’s continue building a community where developers can thrive, leveraging tools that prioritize performance, security, and privacy.

Feel free to reach out to us on Discord with your thoughts, questions, and constructive criticism. Together, we can refine and perfect this feature, making it an indispensable part of every developer’s toolkit.

Stay tuned for more updates, and happy coding!
