mem0ai / mem0
The memory layer for Personalized AI
See what the GitHub community is most excited about this week.
#1 Locally hosted web application that allows you to perform various operations on PDF files
A one-stop, open-source, high-quality data extraction tool supporting PDF, webpage, and multi-format e-book extraction.
😎 Awesome lists about all kinds of interesting topics
A multi-platform proxy client based on ClashMeta; simple and easy to use, open-source and ad-free.
Ollama Python library
NVIDIA Linux open GPU kernel module source
Find and verify secrets
Open source API development ecosystem - https://hoppscotch.io (open-source alternative to Postman, Insomnia)
Distribute and run LLMs with a single file.
Scripts for fine-tuning Meta Llama3 with composable FSDP & PEFT methods to cover single/multi-node GPUs. Supports default & custom datasets for applications such as summarization and Q&A. Supports a number of inference solutions, such as HF TGI and vLLM, for local or cloud deployment. Demo apps showcase Meta Llama3 for WhatsApp & Messenger.
NativeLink is an open-source, high-performance build cache and remote execution server, compatible with Bazel, Buck2, Reclient, and other RBE-compatible build systems. It offers drastically faster builds, reduced test flakiness, and support for specialized hardware.
API Documentation Browser
AutoMQ is a cloud-first alternative to Kafka that decouples durability to S3 and EBS. 10x more cost-effective. Autoscales in seconds. Single-digit ms latency.
🎨 Diagram as Code for prototyping cloud system architectures
Get up and running with Llama 3.1, Mistral, Gemma 2, and other large language models.