
Turning Software Delivery Data into IT and Business Intelligence

By Jeff Downs

Turning software delivery data into IT and business intelligence is nothing new; it's the 21st-century gold rush. Yet getting to this level of maturity is a herculean undertaking, especially for IT organizations looking to obtain accurate cross-tool reporting that provides business-critical insights into their software delivery. For many, building a single source of truth through ETL and custom reporting, chiefly for compliance, is extremely labor-intensive and getting more expensive by the day. There's a reason the data integration market is expected to be worth $12.24 billion by 2022, growing at a respectable 13.7% CAGR: data is the currency of business.

The good news is that Tasktop can dramatically reduce these costs and eradicate much of that manual work. Planview Hub, by virtue of its integration and modeling capabilities, does all the heavy lifting: By normalizing and centralizing data from the software delivery toolchain, we can help IT organizations to improve the quality of their IT and compliance reporting. Even better, we can glean valuable end-to-end system data that can be used for higher-level business-centric Flow Metrics, empowering you to generate more business value through your software delivery. 

Crucially, Tasktop can enable Enterprise IT to overcome three key challenges in generating cross-tool custom reports:

Challenge 1: Cost of ETL

According to Accenture, replicating data from just one tool, ServiceNow, into a data warehouse for reporting can cost up to half a million dollars to set up and up to $300k annually to maintain. Given that most large-scale organizations use at least three main tools in their software delivery toolchains (for development, testing, and support), the cost of data extraction and storage stacks up, and fast. Tasktop can automatically extract the data that matters from multiple data points in real time, normalize it, and stream it into a centralized data warehouse for you to build your custom reports. So instead of blindly replicating a whole repository, Tasktop enables the tool admin to use point-and-click integration to store only a modeled subset of the data that matters to the business, reducing data warehouse costs.
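
To make that concrete, here is a minimal sketch in Python of projecting each artifact onto a modeled subset of fields before it reaches storage. The field names and payload are illustrative assumptions, not Tasktop's implementation.

```python
# Keep only the fields the business reports on, rather than replicating
# the entire repository into the warehouse. Field names are examples.
MODELED_FIELDS = ["id", "status", "priority", "assignee", "updated_at"]

def to_modeled_row(raw_artifact: dict) -> dict:
    """Project a raw tool payload onto the reporting model."""
    return {field: raw_artifact.get(field) for field in MODELED_FIELDS}

raw = {
    "id": "DEF-1042",
    "status": "In Progress",
    "priority": "High",
    "assignee": "jdoe",
    "updated_at": "2021-06-01T12:00:00Z",
    "attachments": ["large binary blobs"],        # never reaches the warehouse
    "comment_history": ["hundreds of comments"],  # never reaches the warehouse
}

print(to_modeled_row(raw))  # only the five modeled fields survive
```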

Challenge 2: DBA Reliance

Because data is stored in different formats from tool to tool, cross-tool metrics require data normalization to make sense of it all. Moreover, unique (and unpublished) database schemas per tool require in-depth exploration that steals precious time from software delivery specialists who should be focusing on their job, not wading through endless data lakes. And if that weren't complicated enough, some tools generate a database per project, meaning you're dealing with hundreds of databases, each with its own nuanced schema. The sheer complexity and effort of this data work is mind-boggling, and helps explain why lead times for building reports are so long; if a report takes two to three weeks to build, its findings are most likely out of date by the time it lands. Moreover, any upgrade to your tools or change to your workflows results in a lot of rework to accommodate the changes to schemas, fields, and values.

Tasktop was built to normalize the data so your DBAs don't have to. Instead of specializing in cross-tool data normalization, their skills and expertise can go toward developing your products. Moreover, the data in the warehouse is modeled into a standard format, so building reports for business stakeholders is much easier, which significantly reduces report lead time and gives a more accurate picture of activities and product status.
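
As a rough illustration of what that normalization involves, consider mapping each tool's status vocabulary onto one canonical set. The tools and mappings below are assumptions for the example, not Tasktop's actual model.

```python
# Hypothetical status normalization: each tool's vocabulary maps onto a
# single canonical set so cross-tool reports can group work consistently.
STATUS_MAP = {
    "jira":        {"To Do": "new", "In Progress": "active", "Done": "done"},
    "servicenow":  {"New": "new", "Work in Progress": "active", "Closed": "done"},
    "azuredevops": {"New": "new", "Active": "active", "Closed": "done"},
}

def normalize_status(tool: str, raw_status: str) -> str:
    """Translate a tool-specific status into the canonical model."""
    return STATUS_MAP.get(tool, {}).get(raw_status, "unknown")

# "Done" in Jira and "Closed" in ServiceNow now count as the same state.
assert normalize_status("jira", "Done") == normalize_status("servicenow", "Closed")
```

Multiply this by every field, workflow variant, and per-project schema, and the case for doing the mapping once, centrally, becomes clear.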

Challenge 3: Data Access

Getting the data out of the tools can be a challenge unto itself. As more and more tools are consumed as a service, SaaS vendors are protecting their shared infrastructure’s performance by throttling API calls. 

Owing to Tasktop's connector technology, the broadest and most diverse on the market with nearly 60 connectors for the most commonly used tools to plan, build, and deliver software, we have an extremely efficient way of extracting the information you need from your tools in near real time. Our high-performance, low-impact connectors are meticulously designed to minimize the number of API calls and maintain the operational stability of your tools.
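
A common pattern for staying within those limits is incremental polling that honors the vendor's throttling signals. The sketch below is generic (the endpoint and parameters are hypothetical, and this is not Tasktop's connector code): it fetches only artifacts changed since the last sync and backs off on HTTP 429.

```python
import time

import requests  # third-party HTTP client, assumed installed

def fetch_changed_artifacts(base_url: str, token: str, since: str) -> list:
    """Pull only artifacts updated since the last sync, one page at a
    time, sleeping whenever the vendor throttles us."""
    artifacts, page = [], 1
    while True:
        resp = requests.get(
            f"{base_url}/artifacts",  # hypothetical endpoint
            params={"updated_after": since, "page": page},
            headers={"Authorization": f"Bearer {token}"},
        )
        if resp.status_code == 429:  # throttled: honor Retry-After
            time.sleep(int(resp.headers.get("Retry-After", 30)))
            continue
        resp.raise_for_status()
        batch = resp.json()
        if not batch:  # no more changes since `since`
            return artifacts
        artifacts.extend(batch)
        page += 1
```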

Furthermore, many SaaS vendors purge data periodically to save on storage, limiting your access to historical data. Tasktop can create a history of your artifacts in your own databases: we record changes to artifacts over their lifespan and store them in your data warehouse, lake, or mart.
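
One simple way to keep such a history is an append-only change log in your own store, so the record outlives any purging on the vendor's side. The schema and names below are illustrative, not Tasktop's implementation.

```python
import sqlite3
from datetime import datetime, timezone

# Append-only history: every observed change becomes a new row, so the
# record survives even after the SaaS vendor purges its own copy.
db = sqlite3.connect("warehouse.db")
db.execute("""CREATE TABLE IF NOT EXISTS artifact_history (
    artifact_id TEXT, field TEXT, old_value TEXT, new_value TEXT,
    changed_at TEXT)""")

def record_change(artifact_id: str, field: str, old: str, new: str) -> None:
    """Log one field change with a UTC timestamp."""
    db.execute(
        "INSERT INTO artifact_history VALUES (?, ?, ?, ?, ?)",
        (artifact_id, field, old, new,
         datetime.now(timezone.utc).isoformat()),
    )
    db.commit()

record_change("DEF-1042", "status", "active", "done")
```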

Popular Reports Generated Based on Tasktop Integration Hub's Modeled Data Stream

Automating Traceability for Compliance

In the current world of enterprise software, traceability is the holy grail chased by many. But traceability can mean very different things depending on who you talk to. For QA engineers, it means ensuring the completeness and thoroughness of the testing process: the ability to trace requirements to test cases, cycles, plans, resulting defects, and so on. For system engineers or business analysts, any discussion of traceability revolves around traceability matrices, which also tie into the testing pieces. For automation engineers, traceability is more about pipeline health and visibility into what's happening there.

For many organizations, traceability is not a choice but a must-have as they are heavily regulated and may be subject to audit. Consider the medical device industry and what it takes to have a new device approved by the FDA: to receive regulatory approval, organizations must provide clear traces of requirements throughout their development lifecycle – not just initiation, but also through execution, testing, and maintenance. Thus, it is crucial to provide proof of what work is being done and where this work goes. The automated traceability provided by Tasktop can trace each requirement to the code that implemented it and the tests that verified it. 
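
Once requirements, code commits, and test runs land in one normalized store, such a trace reduces to a simple join. Below is a minimal sketch against an illustrative schema (not Tasktop's actual data model).

```python
import sqlite3

# Tiny in-memory example of requirement-to-code-to-test traceability.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE requirements (req_id TEXT PRIMARY KEY, title TEXT);
CREATE TABLE commits (sha TEXT, req_id TEXT);
CREATE TABLE test_runs (test_id TEXT, req_id TEXT, result TEXT);
INSERT INTO requirements VALUES ('REQ-7', 'Audit log export');
INSERT INTO commits VALUES ('a1b2c3', 'REQ-7');
INSERT INTO test_runs VALUES ('TC-88', 'REQ-7', 'passed');
""")

# One row per requirement: the commit that implemented it and the test
# that verified it; NULLs expose gaps in the trace for auditors.
for row in db.execute("""
    SELECT r.req_id, c.sha, t.test_id, t.result
    FROM requirements r
    LEFT JOIN commits c ON c.req_id = r.req_id
    LEFT JOIN test_runs t ON t.req_id = r.req_id
"""):
    print(row)  # ('REQ-7', 'a1b2c3', 'TC-88', 'passed')
```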

Ensuring Tools and Practices are Being Adopted

Global enterprises continue to invest heavily in tooling and best practices. But how do you know if teams are really adopting them? Tasktop can help consolidate process and adoption metrics to know if practitioners are indeed using and benefiting from them. Reports based on Tasktop’s modeled data stream can tell you if developers are linking code commits to stories, if test cases are linked to the originating requirement or story, or if a new build process or tool is being adopted. 
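
As a toy version of one such adoption metric, the sketch below measures what share of commit messages reference a story key. The commit format and key pattern are assumptions for the example.

```python
import re

# Hypothetical convention: commits reference stories by keys like "PROJ-123".
STORY_KEY = re.compile(r"\b[A-Z]+-\d+\b")

commits = [
    {"sha": "a1b2c3", "message": "PROJ-123 add audit export"},
    {"sha": "d4e5f6", "message": "fix typo"},  # unlinked commit
]

linked = sum(1 for c in commits if STORY_KEY.search(c["message"]))
print(f"Commits linked to stories: {linked / len(commits):.0%}")  # 50%
```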

Tangible Business Outcomes 

Cross-tool data can help to monitor activities that lead to tangible business outcomes, such as product quality. For example, when it comes to testing, tracking Defect Detection Percentage (DDP) can provide powerful real-time insights into your QA process and the total cost of quality.

Most organizations identify, log, and triage defects via multiple systems such as Jira, Micro Focus ALM, Azure DevOps, qTest Manager, and Tosca. At the same time, end-users are logging incidents in ITSM systems like ServiceNow and BMC Remedy.

With no single tool tracking all this work, and with multiple siloed data points, calculating your DDP (the number of defects found in the test phase versus defects found in production, which helps assess the ROI of your testing investment) on an ongoing basis is an arduous undertaking. Leveraging Tasktop's live data stream to gather data from numerous tools in real time gives your teams the visibility they need to make decisions about how to optimize investment in testing.

For example, you can make crucial decisions around the total cost of quality. You could invest heavily in catching defects before production, which will slow down time-to-market. Or do less testing and get to market faster, where your customers will find your bugs. Metrics like DDP provide the insight needed to strike a happy medium between the two.
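
Once the defect counts from those systems are consolidated, the metric itself is simple; a minimal sketch based on the definition above:

```python
def defect_detection_percentage(found_in_test: int, found_in_prod: int) -> float:
    """Share of all defects caught before release, per the definition above."""
    total = found_in_test + found_in_prod
    return 100.0 * found_in_test / total if total else 0.0

# e.g. 85 defects caught in QA and 15 that escaped to production
print(defect_detection_percentage(85, 15))  # 85.0
```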

Status Reports

While the project-to-product movement continues to gather pace, most traditional organizations are not yet at the maturity level to do away with established project management. That generally means project status reports to monitor and track performance and deliverables (or, to be more exact, activities). Tasktop's live stream of data cuts the time it takes to create a report and improves the quality of its data, for a more accurate summary of how a project is progressing.

Access to the correct data in a timely fashion helps any company overcome a challenging technical hurdle. However, that doesn't prevent these reports from driving bad or ineffective behavior. To quote Eli Goldratt, "Tell me how you will measure me, and then I will tell you how I will behave. If you measure me in an illogical way, don't complain about illogical behavior."

Tasktop Integration Hub – your lifebuoy in a sea of data

Tasktop keeps you afloat in the endless sea of data, simplifying how you collect, normalize, and centralize data for better reporting:

  • Cost-effective: through existing building blocks, models and collections, tool admins can easily extract data out of tools into a common datastore 
  • Low effort: on-the-fly data normalization to correlate data for automated traceability 
  • Easy data access: live data stream across the toolchain ensures you own your data in a SaaS world

It really is that easy. Let us prove it by starting an evaluation today.

Using live data streaming to build end-to-end traceability

Written by Jeff Downs

After seeing first-hand how disparate tool suites can hinder an organization, Jeff joined Tasktop to help fulfill the company’s mission of connecting the world of software delivery. At Tasktop, Jeff works as a Principal Pre-sales Engineer to help companies in a wide variety of industries make better use of software development and delivery tools and enable collaboration across the software lifecycle—through the power of integration. Jeff also has more than ten years of hands-on testing and test tool experience at LexisNexis, where he led the transformation of their Test Center of Excellence. As a tool administrator, Jeff drove the effort to improve efficiency and capability through tool best practices, integration, and administration. Contact Jeff on Twitter @jdowns26.