GitHub - louis030195/screen-pipe: Turn your screen into actions (using LLMs). Inspired by adept.ai, rewind.ai, Apple Shortcut. Rust + WASM.

Jun 26, 2024 - github.com
The article discusses a project called ScreenPipe, which is designed to turn screen data into actionable tasks using Large Language Models (LLMs). The project, inspired by adept.ai, rewind.ai, and Apple Shortcut, is written in Rust and WASM. It captures screen data and uses an LLM like OpenAI's to process text and images, with an example provided of analyzing sales conversations. The project is currently in its alpha stage and is considered experimental.
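The capture-then-prompt flow described above can be sketched in TypeScript (the language the project reportedly uses server-side). Everything here is an illustrative assumption, not ScreenPipe's actual API: the `CapturedFrame` shape, the `buildPrompt` helper, and the sample frames are all hypothetical.

```typescript
// Hypothetical sketch of a screen-capture -> LLM pipeline; all names are
// assumptions, not ScreenPipe's real interfaces.

// A captured frame: text extracted from the screen plus metadata.
interface CapturedFrame {
  text: string;      // text recovered from the screen (e.g. via OCR)
  app: string;       // application the text was captured from
  timestamp: number; // capture time, ms since epoch
}

// Build a prompt that asks an LLM to act on the captured screen context,
// e.g. analyzing a sales conversation visible on screen.
function buildPrompt(frames: CapturedFrame[], task: string): string {
  const context = frames
    .map(f => `[${new Date(f.timestamp).toISOString()}] ${f.app}: ${f.text}`)
    .join("\n");
  return `Screen context:\n${context}\n\nTask: ${task}`;
}

// Example: two frames from a hypothetical sales call.
const frames: CapturedFrame[] = [
  { text: "Customer: pricing seems high", app: "Zoom", timestamp: 1719400000000 },
  { text: "Rep: we offer volume discounts", app: "Zoom", timestamp: 1719400005000 },
];
const prompt = buildPrompt(frames, "Summarize this sales conversation.");
// `prompt` would then be sent to an LLM API such as OpenAI's.
```

The point of the sketch is the division of labor: the capture layer only records text and metadata, while all interpretation is deferred to the LLM via a plain-text prompt.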

The article also emphasizes the importance of open sourcing such technologies, arguing that recent AI breakthroughs have shown the importance of context and that technologies enabling this level of personalization should be accessible to all developers. The project is designed to record the screen and associated metadata and pipe it somewhere. The author encourages contributions to the project and provides information on how to get involved. The project is licensed under the MIT license.

Key takeaways:

  • ScreenPipe is a project that turns your screen into actions using Large Language Models (LLMs). It's inspired by adept.ai, rewind.ai, and Apple Shortcut.
  • ScreenPipe is written in Rust + WASM, with server-side code in TypeScript. It processes text and images with an LLM like OpenAI's, for example to analyze sales conversations.
  • The project is currently in the alpha stage and runs on the author's computer. It captures screen data and performs actions based on it.
  • The author believes the technologies enabling this level of personalization in AI should be open source to accelerate access to the next stage of our evolution. ScreenPipe is therefore open source, and contributions are welcome.