Too Many Tools, Too Complex? Build Your Own Workflow in Seconds ⚡
TL;DR: simple, lightweight, and fully hackable observability + evals. Cut out the noise and focus on the data + metrics that matter to you 🎯
⬇️ Clone our Open Source Framework
💬 Chat with us
😫 Problem
Current LLM infra has too many layers of abstraction, with rigid schemas and too many concepts to learn. This hides far too much from users, making things seem more complex than they really are, and, paradoxically, it also makes these tools rigid and inflexible.
LLMs are also only part of a broader product, so your observability and eval infra shouldn't be exclusively obsessed with LLM abstractions. The product and your users are what really matter, and needs vary wildly from product to product. Your observability and eval tooling should reflect that!
🧠 Not Anymore
The core building block is simple: just "unify.log". It lets you log any kind of data to your console for easy visualization, grouping, sorting, and plotting.
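For example, a log entry is just a set of named values, so you can track plain product signals alongside (or instead of) LLM calls. Here is a minimal sketch, reusing the client pattern from the Getting Started snippet below; the field names are purely illustrative:

from unify import Unify

unify = Unify("gpt-4o@openai")  # any endpoint works; we only need the client's log method here

# log whatever matters to your product: each keyword becomes data you can
# group, sort, filter, and plot in the console (field names are illustrative)
unify.log(user_id="user_123", page="checkout", latency_ms=182, thumbs_up=True)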
You can then hack together your own custom interface for whatever you want, using three basic tile types: Tables 🔢, Views 🔎, and Plots 📊
Use these three primitives to:
✅ create + visualize your datasets in a new tab
✅ monitor and probe production traffic in a new tab, with or without LLMs
✅ start an evaluation flywheel in a new tab, with or without LLMs (see the sketch after this list)
📈 optimize your product for your users, with or without LLMs
🧠 whatever else you can think of!
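As a rough sketch of that evaluation flywheel: the scorer below is a hypothetical placeholder (swap in an LLM judge, a regex check, or human labels), and only the generate and log calls mirror the Getting Started snippet further down; everything else is illustrative.

from unify import Unify

unify = Unify("gpt-4o@openai")

def score(question, answer):
    # hypothetical scorer: replace with an LLM judge, a regex check,
    # or a human label; the flywheel works with or without LLMs
    return float("paris" in answer.lower())

question = "What is the capital of France?"
answer = unify.generate(question)

# log inputs, outputs, and scores together, then group, sort, and plot
# them with Table, View, and Plot tiles in the console
unify.log(question=question, answer=answer, score=score(question, answer))

Loop this over a dataset and a console tab becomes a simple eval dashboard, no extra schema required.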
Check out our minimal demo explaining how to use these building blocks to ship with speed and clarity ⚡
https://youtu.be/fl9SzsoCegw?si=GKDi4TLEUXNgfWy2
Getting Started 🛠️
Sign up, check out our repo, and you're ready to go!
from unify import Unify

# create a client for any endpoint, in model@provider format
unify = Unify("gpt-4o@openai")

msg = "hello"
response = unify.generate(msg)

# log any fields you like as keyword arguments
unify.log(message=msg, response=response)

# visualize at https://console.unify.ai
🙏 Our Ask
Give Unify a try and let us know what you think! Check out our walkthrough, and if you like it, give our repo a star 🌟 If you don't think it's useful, please tell us. We'd love to know!