May 30, 2023

Statistical Charts Now Help To Spot Trends

TL;DR

In Sprint 19, we caught up on our technical debt, squashed a few bugs, and positioned the team to start building priority features from the Bloomfilter backlog. We also ramped up our focus on design, creating the concepts that will help form the new home page for signed-in users. Exciting updates are coming your way—stay tuned!

Sprint goal

Based on feedback from early adopters of our project statistical charts, we set a goal to quickly update the design and data visualization of our historical, time-based chart views. Now, you can get a 16-sprint view of important process measures like lead time, reaction time, throughput, velocity, and cycle time.

Updates

Customer facing

  • We’ve improved the detail-level view of our project statistical charts. Now, you can track historical process results as far back as 16 sprints! Plus, we offer trailing averages over three-month, six-month, and 12-month periods to help you grasp how your time-based statistical results are evolving over these key product development timeframes.
An updated view of project statistical charts, going back as many as 16 sprints with trailing trends.
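For readers curious about the mechanics, a trailing average is simply the mean of the most recent N data points, recomputed each sprint. Here's a minimal sketch in Python; the `cycle_times` values and the six-sprints-per-quarter assumption are hypothetical, not Bloomfilter's actual data or implementation.

```python
from statistics import mean

def trailing_average(values, window):
    """Average of the most recent `window` values (or all of them, if fewer)."""
    recent = values[-window:]
    return mean(recent)

# Hypothetical per-sprint cycle times in days, oldest to newest
cycle_times = [5.2, 4.8, 6.1, 5.5, 4.9, 5.0, 5.7, 4.6]

# With two-week sprints, about six sprints cover a three-month period
print(trailing_average(cycle_times, 6))  # averages the last six sprints
```

Because only the window changes, the same function covers the three-month, six-month, and 12-month views.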

  • You can now supply Bloomfilter with complexity estimates using either the “Story Points” or the “Story Point Estimates” fields in Jira.
We can now interpret either the “Story Points” or “Story Point Estimates” fields from Jira.



  • We removed the “green band” from statistical charts on the Projects page to eliminate confusion caused by wide band ranges, resulting in cleaner, easier-to-read bar charts.

Clean, easy-to-read statistical charts on the Projects page.

Non-customer facing

  • We’re making Bloomfilter’s data picture more dynamic. We successfully migrated Cycle Time, Lead Time, and Reaction Time to a task graph, which reflects real-time activity rather than relying on static date stamps in Jira and GitHub.

What we fixed

This sprint, we tackled a number of minor fixes, including the consistency and performance of our project board dropdown selector, our project statistical chart labels, and the historical calculation of cycle time. We also tightened the copy in our welcome email.

