CSBI Course 4: Business Intelligence Technical Skills
- The Basics
- Getting It Done
Reflect what you have learned
Certified Specialist Business Intelligence (CSBI) Reflection
Business Intelligence Technical Skills
This course provides an overview of business intelligence technical skills, including how to apply the concepts and tools of BI to understand what to work on and how, as well as how to use databases and related report-writing applications. The section also provides explanations of business intelligence applications and approaches that are used to derive information from large clinical, financial, and other databases to support better decision-making. Methods of presenting and displaying information in a clear manner are also discussed.
Application of Concepts and Tools to understand what to work on and how to do it
The work of the BI/Analytics consultant needs to move beyond the development and production of routine daily reports, analysis and responses to information requests that come into the office.
Through the use of appropriate data management, analytic tools and approaches, organizationally significant areas or items can be identified that should be targeted or monitored, many of which may currently be missed or under-scrutinized. Identifying these significant areas reveals the work that needs to be engaged. One needs to frame the questions and guide users to focus on the most meaningful analyses.
With the healthcare marketplace calling for greater value from all players, delivery of BI/analytics information must support high levels of organizational achievement related to the five Power Decision attributes.
Future success will rely on decision-making that leads to repeatedly doing things better (with precision and accuracy). Decision-making needs to be expeditious in a way that achieves the best clinical outcome and/or business operational result. Success means being faster than the competition and being able to quickly handle the differing demands of individual healthcare users and the general marketplace in a consumer-centric world (agility). And cost makes a difference: information that is easily understood and represents one truth is needed to support this.
The goal is to obtain information that addresses the questions or issues at hand. We can refer to this as one truth, and this one truth needs to be clearly conveyed when it matters: when a decision is required.
The BI/Analytics consultant knows how to achieve this. It is not a simple matter of item selection from a drop-down menu. An example would be requests related to department productivity. Calculate productivity using finance system data, and one number is attained. Ask the department manager and they have another number from the scheduling system. A different number is supplied when the time clock and payroll data are accessed. Whether these differences are small or large is beside the point. Three numbers are no longer acceptable.
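The three-numbers problem above can be made concrete with a small sketch. All figures and the hours-per-unit ratio below are invented for illustration; they stand in for the finance, scheduling and time clock/payroll systems named in the text.

```python
# Hypothetical illustration of "three numbers" for one productivity question.
# All values are invented; the ratio used is worked hours per unit of service.

def productivity(worked_hours: float, units_of_service: float) -> float:
    """One common productivity ratio: worked hours per unit of service."""
    return worked_hours / units_of_service

# Each system reports slightly different hours for the same department/period.
finance = productivity(worked_hours=3_120.0, units_of_service=1_040)     # GL-derived hours
scheduling = productivity(worked_hours=3_255.0, units_of_service=1_040)  # scheduled hours
payroll = productivity(worked_hours=3_188.5, units_of_service=1_040)     # time-clock hours

for source, value in [("finance", finance), ("scheduling", scheduling), ("payroll", payroll)]:
    print(f"{source:10s}: {value:.2f} hours/unit")
```

Three defensible answers to a single question: reconciling them to a single agreed figure is exactly the "one truth" work the consultant performs.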
Revealing Items of Significance
One truth is achieved through organizational decisions made as the data architecture and warehousing systems are designed and implemented. The decisions on what to extract, transform and load are crucial because these lay the foundation for the data that become one truth. There are trade-offs one must be aware of. For example, warehouse data may not be as good as raw data for some purposes, yet raw data may not be available or accessible. One may need to meticulously tie and reconcile this raw data to that one truth.
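The tie-and-reconcile step described above can be sketched as follows. This is a minimal illustration, assuming Python; the department names, charge amounts and warehouse totals are all invented.

```python
# Minimal sketch of tying warehouse figures back to raw source data.
# Field names and values are invented for illustration.

raw_transactions = [  # detail rows from the source system
    {"dept": "ED", "charge": 120.0},
    {"dept": "ED", "charge": 80.0},
    {"dept": "ICU", "charge": 300.0},
]

warehouse_totals = {"ED": 200.0, "ICU": 295.0}  # aggregates as loaded into the warehouse

# Re-total the raw detail by department.
raw_totals: dict[str, float] = {}
for t in raw_transactions:
    raw_totals[t["dept"]] = raw_totals.get(t["dept"], 0.0) + t["charge"]

# Reconcile: does each warehouse total tie to the raw detail?
for dept, wh_total in warehouse_totals.items():
    diff = wh_total - raw_totals.get(dept, 0.0)
    status = "ties" if abs(diff) < 0.005 else f"off by {diff:+.2f}"
    print(f"{dept}: warehouse {wh_total:.2f} vs raw {raw_totals.get(dept, 0.0):.2f} -> {status}")
```

In practice the discrepancy (ICU above) triggers investigation of the extract, transform and load rules, not a quiet adjustment of either number.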
Using database and related report writing applications
Historically, healthcare provider organizations’ use of databases and reporting has been narrowly directed and concentrated, with concomitant issues, as will be noted in the six points. This type of database use should not be hastily dropped, as there is value in at least some of the output generated which will be considered later. However, such uses are insufficient to accomplish the tasks needed in the new healthcare business environment.
Analysis, Analytics and Business Intelligence
The BI/analytics work traditionally performed, and much of the work done today, continues to be a necessary decision-support function involving the framing and formatting of reports and relatively basic descriptive analytics, if any analytics at all. Added more recently to the output of this work are visualized reports. These visualizations are available as organizations adopt new reporting tools. They may or may not represent a move into analytics, depending on whether they bring new depth and breadth that reveal insight or foresight, or merely repackage data without improving decision-making precision, speed or consistency. Such visualizations should not be confused with engaging analytics.
This work in healthcare is often still performed within the silos of individual process applications. Limitations revolve around the embedded report writers, which have little, if any, capability for depth and breadth of analysis. The output of these applications is typically designed for simple descriptive reporting or extracting data electronically, yielding simple columns for fields and rows for records. Such applications may be based on legacy system architecture and may not be able to be accessed by newer, sophisticated tools.
Most of this work has been financially focused in nature. For financial statements, budgeting and monitoring, regulatory reporting, KPIs, etc., organizations use financial system applications such as the general ledger and billing transactions, and applications such as order entry to examine areas such as volumes, procedures, or timings.
This work is typically based on performing recurring, routine reporting and analysis, such as financial statements, cost reporting, KPIs and variances; recurring periodic activities, such as budgeting; and ad-hoc requests.
Regardless of the approach (siloed and manual reporting or a more integrated approach), the output is typically of a simplistic, descriptive, analytic nature providing quantities, comparisons, percentiles and maybe trend lines on graphs. Remember that descriptive analytics includes some more powerful tools for understanding the data: frequency distributions and standard deviations, combinations, cross tabs (pivot tables), scatter plotting and others. Routine reporting does not use any of these tools. Without such tools, there is limited or no ability to understand the magnitude, depth and breadth of a situation that needs to be understood. Indeed, the valuable methodology needed to unlock high-impact information should not be engaged as part of a “dashboard” display, as this type of information does not work well in that format.
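As an illustration of the descriptive tools named above (frequency distributions, standard deviations, cross tabs / pivot tables), the following sketch uses pandas on a small invented encounter dataset; the column names and values are hypothetical.

```python
import pandas as pd

# Invented encounter-level data for illustration only.
df = pd.DataFrame({
    "department": ["ED", "ED", "ICU", "ICU", "ED", "ICU"],
    "payer":      ["A",  "B",  "A",   "B",   "A",  "A"],
    "los_days":   [1.0,  2.5,  4.0,   6.5,   1.5,  5.0],
})

# Distribution and spread -- count, mean, std, quartiles -- beyond simple totals.
print(df["los_days"].describe())

# Cross tab (pivot table): mean length of stay by department and payer.
pivot = df.pivot_table(values="los_days", index="department",
                       columns="payer", aggfunc="mean")
print(pivot)
```

A routine report would stop at a grand total; the pivot immediately exposes where the variation lives, which is the kind of depth the text argues routine reporting misses.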
Most often, application report writers are insufficient to reveal the information needed for powerful decision-making. A good deal of a BI/analytics consultant’s time is spent manually analyzing extracted data with individual files or manually compiling data from various systems for analysis inside other tools.
Viewpoints and Decision-Making Power: Example - Agency versus Overtime
The power is derived from the fact that decisions in the future must be made in alignment with the five key decision attributes listed later in this section. This type of decision-making is demanded in a cost-conscious, resource-constrained, value-starved, hotly competitive, turbulent environment, such as healthcare today.
However, this may still be insufficient. The next section will display an ordered approach to taking advantage of the leverage that descriptive, predictive and prescriptive analytics can provide for more powerful decision-making and success in healthcare service delivery in any environment.
Some organizations are on this path, as reform takes hold and meaningful use achievements become more sophisticated. There are difficulties, as movement along the path is essentially sequential but not strictly consecutive. The approach is iterative, explorative and experimental and requires looking at differing positions on the path simultaneously. Yet, by jumping ahead or failing to consider each point on the path, one can miss vital knowledge needed for strong construction of the later positions.
Getting it Done
Monitoring and Planning
One naturally first considers the situation in terms of what data elements might be required.
Consider the IT applications in which such assumed data might reside: the individual process application, a related process application, or a secondary storage location.
Next, we should evaluate the capability and possibility of accessing, reporting and/or extracting information from the application where the data resides.
● Process applications often suffer from very limited reporting functionality and capability. This is the case for extraction as well. So: is the data available, accessible (within the timeframe called for, or in real time), and in a useful format using those tools? Is that tool available? What is the staff capability to use the tool? If not, is the data in the warehouse? This is the key issue that needs careful attention. Construction of the warehouse is complicated. Storage costs, while dropping dramatically in recent years, remain an additional expense that is always incurred begrudgingly, and usually a case must be made for including each data element. Further, in making this case, certain functions are favored (e.g., RevCycle), and once a data element is no longer needed for the favored function it might be dropped from the warehouse for cost reasons. Thus data elements sought for reporting and analytic purposes are not available over the long term. For instance, the warehouse of a major midwest system dropped ED time-in/time-out elements after 60 days, making longitudinal LOS studies and other insight and foresight questions far more difficult to engage. This is not irreparable, yet a time-consuming new set-up must be engaged.
An alternative to the above might be the related process application. However, the same questions would apply.
If the alternatives above do not prove successful, then the question of a secondary storage location(s) should be explored. When moving in this direction, be especially careful to consider what is contained in such a location. The first question to ask, among others, is: does the location contain all the data elements of the process application?
Assuming one can obtain data through an approach discussed above and the data is in hand, it must be determined whether this is the data needed to address the question. Often when the initial data is retrieved, a new set of questions arises. For example, is other data needed that was not initially thought of or considered important? Often additional or alternative data elements are identified. One must also check the veracity of the data. The data from the source chosen may lack the completeness, integrity and reliability needed to achieve the one truth sought.
It is important to check the work. Is this what is expected? Are the results due to the way the work was performed? Are there duplicate records, empty fields or “strange” data that are possibly due to a setup of the query, rather than a problem with the data?
The key point here is that if questioning of this sort does not arise naturally, then a critical aspect of the work is to make sure these questions are asked and examined.
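The "check the work" questions above (duplicate records, empty fields, "strange" data) can be turned into a simple automated scan. This is a minimal sketch; the field names, rows and validity rules are invented for illustration.

```python
# Sketch of a basic data-quality scan over an extract.
# Rows and rules are hypothetical; real checks would come from the data dictionary.

rows = [
    {"mrn": "1001", "admit": "2024-01-03", "los_days": 2},
    {"mrn": "1002", "admit": "2024-01-04", "los_days": 5},
    {"mrn": "1002", "admit": "2024-01-04", "los_days": 5},   # duplicate record
    {"mrn": "1003", "admit": "",           "los_days": 4},   # empty field
    {"mrn": "1004", "admit": "2024-01-06", "los_days": -1},  # "strange" value
]

seen, duplicates, empties, strange = set(), [], [], []
for row in rows:
    key = (row["mrn"], row["admit"])
    if key in seen:                                   # same patient/admit seen twice
        duplicates.append(row)
    seen.add(key)
    if any(v == "" or v is None for v in row.values()):  # blank fields
        empties.append(row)
    if not (0 <= row["los_days"] <= 365):                # implausible LOS
        strange.append(row)

print(f"{len(duplicates)} duplicate(s), {len(empties)} empty-field row(s), "
      f"{len(strange)} out-of-range row(s)")
```

Findings like these then feed the next question the text raises: is the anomaly in the data itself, or in the way the query was set up?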
During the questioning process you will likely need to engage in stakeholder analysis and process flow charting, whether or not you have engaged either of these when considering the initial data requirements.
It is not required to engage stakeholder analysis and process flow earlier; however, at least a cursory review of the situation considering stakeholders and processes is called for now and should not be ignored. Without such review, important details may be missed that are crucial to success.
What is being encountered at this point is the iterative and experimental portion of performing this work. Initial data is obtained, raising questions that may require new data pulls, which means it will be necessary to go back to the points we have covered, all the while considering everything in light of the shifting work of understanding stakeholders and process (which may evolve as more questions are asked and iterations occur).
This is the true nature of a good deal of the BI/Analytics consultant’s work. Simple reporting will not suffice over time. Along with complexity comes the need for richer, more complex information that involves more sources and more players and may be more demanding to produce.
Stakeholder Analysis is essential
The stakeholder analysis is essential for fostering transparency and breaking down barriers that inevitably arise when one starts asking questions about data. This is especially the case when the veracity of data is questioned. Remember, the BI/Analytics consultant is most likely an outsider, not a process owner, and will not be considered to know anything about “the way things really work here.” In such a situation, working with those responsible for data accuracy and integrity can require great sensitivity. One must know who the players are, understand their interests and have a plan for connecting with them in meaningful ways. One must understand who has the power and who can, and will, become champions.
Process flowcharting is essential
Process flow is essential. First and foremost, it shows what you are working with, which may be different from what you should be working with. Maybe the process is different than verbally described or initially considered. Second, process flow charting allows all parties involved to ask questions and participate, so all will have a visible and mutually agreed-upon view of the process. This allows everyone to come to agreement on what the data is and where it comes from. The transparency and collaboration fostered in this process would likely be an important part of any stakeholder plan. Process flow charting, in concert with the other points of this approach, ensures identification of relevant questions and issues to be addressed.
One might ask at this point
Why not engage stakeholder analysis and process flow charting right off the bat and be done with it? This is a good idea; however, it is not required initially. Once data is in hand, one would look at the reality of the data and would need to engage stakeholder analysis and process flow regardless. This ensures that the right data is obtained to address the need, that it contains what it was thought to contain, and that any needed additions or alternatives are identified.
Monitoring and Planning
Reporting and descriptive analytic tools are widely available from vendors of all sizes and scales. Automation of functions and processing, ease of use, and display of information are critical selection criteria.
If line staff are the ones who have to use the tools, they need tools that fit their workflow rather than alter it. They will not be able to use tools that add to their daily task list. At a minimum, the situation needs to be neutral with respect to workload. At best, the tools should save time, require less work and display information in a way that is immediately easy to understand and useful. The tools need to add precision, accuracy and speed to their decision-making.
Operational Transaction Focused Real-Time Decision-making
The steps in the situation are largely the same as in monitoring and reporting. The difference is that the organization should want to leverage predictive and prescriptive analytics for real-time decision-making. Decisions made in this way achieve high performance in relation to the 5 power decision attributes. Enabling and embedding these types of analytics into real-time operations allows the organization to achieve real results.
The 5 key power decision attributes are:
1. Targeted results
2. Replicable: done in the same way, repeatedly
3. Adaptable to differing circumstances
4. Expeditious
5. Low cost
The difference in applying analytics in operational situations is understanding the requirement for real-time access to data that is refreshed and updated. Where this is required, warehoused data cannot be used. Warehoused information is old, even if only by a few minutes. The access requirement means that only data from the process applications is useful. If there are difficulties accessing, refreshing and updating data, which could be the case with proprietary and legacy systems, one may not be able to go further with the project. This situation is not nearly as prevalent as it has been, even in the recent past; however, one should always be attuned to the level of real-time access possible, and drive for greater levels of real-time access in order to assist real-time decision-making.
Not all operational situations require this real-time interaction. Through process flow charting, one will understand the operational processes: where, how and when data is available and used. With this information, the immediacy of the transaction can be known. Operational situations that do not involve an immediate transaction can use warehouse data.
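The data-source decision described above (process application for immediate transactions, warehouse otherwise) can be expressed as a simple rule of thumb. The function name and the refresh and freshness figures below are hypothetical.

```python
from datetime import timedelta

def usable_source(decision_max_age: timedelta,
                  warehouse_refresh_interval: timedelta) -> str:
    """Pick a data source: warehouse data is, at worst, one refresh interval old.

    If the decision tolerates data that old, the warehouse suffices; otherwise
    the process application must be read directly.
    """
    if warehouse_refresh_interval <= decision_max_age:
        return "warehouse"
    return "process application (real-time)"

# A bed-assignment decision that must reflect the last 5 minutes of activity,
# against a warehouse refreshed nightly:
print(usable_source(timedelta(minutes=5), timedelta(hours=24)))
# A monthly utilization review tolerates day-old data:
print(usable_source(timedelta(days=1), timedelta(hours=24)))
```

The process flow chart supplies the first argument: it tells you how fresh the data must be at the moment the decision is made.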