Redesigned Requests Page for Enhanced LLM Observability

October 3, 2024

We’re excited to announce a major redesign of our Requests page, improving the experience and efficiency of LLM observability.

Key Improvements

  • Streamlined Navigation: Quick toggle between requests without closing the drawer, allowing for faster review and comparison.
  • Compact Information Display: More data visible at a glance with a sleeker, more compact row design.
  • Reduced Visual Clutter: A cleaner interface that focuses on essential information.
  • Enhanced Time Selector: Improved configuration options and quick select features for more precise data filtering.
  • Unobstructed Page Navigation: The chat widget no longer blocks page navigation, ensuring a smoother experience.

Benefits for LLM Developers and Data Scientists

  • Efficient Prompt Analysis: Easily view and compare prompts across multiple requests (see the example after this list).
  • Improved Performance Monitoring: Quickly identify trends and anomalies in your LLM applications.
  • Streamlined Workflow: Navigate through large volumes of request data with ease.
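
If you aren’t yet sending traffic through Helicone, here is a minimal sketch of how requests reach this page, using the standard OpenAI proxy integration. The model, prompt, and environment variable names below are illustrative placeholders, not part of this release.

```python
import os
from openai import OpenAI

# Route OpenAI calls through Helicone's proxy so each call shows up
# as a row on the Requests page.
client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://oai.helicone.ai/v1",  # Helicone proxy endpoint
    default_headers={
        "Helicone-Auth": f"Bearer {os.environ['HELICONE_API_KEY']}",
    },
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "Summarize this week's error logs."}],
)
print(response.choices[0].message.content)
```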

This redesign reflects our commitment to providing the best tools for LLM observability. We’ve focused on enhancing the core features that matter most to our users, making it easier than ever to gain insights from your LLM application data.

We encourage you to explore the new Requests page and experience the improvements firsthand. Your feedback is valuable as we continue to refine and enhance Helicone’s observability platform.