
Posts Tagged ‘decision enablement’

The Four Degrees of Understanding

Posted by

If people are connected by six degrees of separation, then I contend that raw data and true IT wisdom are connected by four degrees of understanding.  Let me explain. (And this won’t involve Kevin Bacon at all. Promise.)

For the past 25 years I have implemented ERP systems that are essentially a historical picture of what transpired.  For most of that time, I have challenged people to make their accounting data predictive, not historical.  I recognize that you must always look back to report your results, but using “data” to predict what is about to happen, and then knowing how to react, is a whole new ballgame.  Some would say that is the role of BI.  Maybe it is. At some point, data should do you the favor of becoming information.

So, I explain four degrees of understanding this way:

  1. If you have a set of numbers, but you don’t know what they represent, you have data.  For example 23,000 and 25,000.  They are meaningless until you know what they represent.  Your systems have tons of data, but how do you use it?

  2. The second degree is called information.  If I tell you what those numbers mean, then you have some understanding of what you are looking at.  For example, if I tell you that they are Daily Sales, then you can start to analyze that data to make some judgments.  Sales went up? Maybe.

  3. The third degree of understanding is knowledge.  Knowing what the data is and knowing what to do with it gives you knowledge.  In this scenario you may determine that sales went up, and that might provide meaningful insight.  For example, you might ascertain from that information that the “buy one get one free” incentive raised sales for the day.  With that knowledge you can make judgments about how effective the program was.

  4. The fourth degree of understanding is wisdom.  Being able to take that knowledge and predict in a meaningful way how you should react to certain situations is the wisdom that makes many men (and women) respected leaders.  Does it make you visionary?  Maybe.  Does it make you a genius? Sometimes.  Does it earn you the respect of your peers? Usually.

There are thousands of examples every day of how people use the wisdom they have accumulated after years in business to make educated decisions.  Not all of those decisions turn out to be right, but most are not made “by the seat of the pants.”

I try every day to lead customers to understand how accounting “data” can be used to generate wisdom.  It is never easy, and it usually leads to more questions than answers, but understanding the roadmap that turns data into wisdom is a great place to start.  Having the ability to act on that wisdom and see the results is empowering.  What is holding you back?

PeopleSoft Strategic Sourcing: Total Cost Modeling

Posted by

In my previous post, we discussed bid factor weighting and scoring. Today, we will review total cost modeling, which allows bids to be analyzed based upon lowest price, best score or lowest total cost.  In simple language, total cost modeling provides the flexibility to make the best decision for award.  PeopleBooks has some good information and easy-to-understand examples for this, so we will use those examples to demonstrate how total cost modeling works.  When we get to the point of discussing creating events, bidding on events and ultimately analyzing events for award, we will demonstrate how total cost modeling functions.

Here’s what PeopleBooks has to say:

By using the total cost modeling feature, we can designate cost contributions for bid factors. Depending on the type of bid factor, costs can be calculated based on the bidder’s bid price, the bidder’s bid quantity, a predefined cost range, or a user-defined cost. The system can then calculate a cost related to each bidder’s response to a bid factor, as well as total line cost and total event cost. This information can then be used either during manual analysis, or by the optimization engine to determine an ideal award.

The following example illustrates how the cost modeling can be used. You are purchasing an item that has a warranty bid factor associated with it. You are asking the bidders to indicate the length of warranty provided for the item, with a range of one year to five years. The longer the warranty period provided, the less your organization will need to pay for maintenance and repair costs. You determine that each extra year of warranty provided saves your organization $50 in maintenance and repair per unit. You can assign this cost to the warranty bid factor so that the total cost for this bid factor will be calculated based on the bidder’s response. One bidder may bid $1,000 per unit but only provide a one-year warranty, while another bidder may bid $1,100 per unit but provide a five-year warranty. Even though the first bidder has a lower bid price, the second bidder will have an overall lower cost because the bidder is providing the full five-year warranty.

Here is another example that we will actually show and demo in a future blog.  It is an easy scenario to understand, with only a header bid factor and the price bid factor at the line level.  In this scenario we have one question at the header, which is a warranty question.  The best warranty is 5 years and the worst warranty is 1 year.  For each year of warranty less than 5 years, an additional $1.25 is added to the total cost.

  • Vendor A:  Bids $20 and has a 5 year warranty.
  • Vendor B:  Bids $20 and has a 3 year warranty.
  • Vendor C:  Bids $20 and has a 1 year warranty.

In this simplistic scenario, Vendor A would have the best total cost at $20.  Vendor B would have a total cost of $22.50 and Vendor C would have a total cost of $25.  All other factors aside, if cost and warranty were the only bid factors, Vendor A would win from a total cost standpoint.
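To make the arithmetic concrete, here is a minimal Python sketch of the warranty cost calculation. This is not PeopleSoft code or configuration, just an illustration of the math being applied; the penalty rates come straight from the two examples above.

def total_cost(bid_price, warranty_years, max_years=5, cost_per_missing_year=1.25):
    """Bid price plus a penalty for each year of warranty short of the maximum."""
    return bid_price + (max_years - warranty_years) * cost_per_missing_year

# The vendor example above: $20 bids with a $1.25 penalty per missing warranty year.
for vendor, (price, years) in {"Vendor A": (20.00, 5),
                               "Vendor B": (20.00, 3),
                               "Vendor C": (20.00, 1)}.items():
    print(vendor, total_cost(price, years))            # 20.0, 22.5, 25.0

# The PeopleBooks example: $50 of maintenance saved per extra year of warranty.
print(total_cost(1000, 1, cost_per_missing_year=50))   # 1200.0
print(total_cost(1100, 5, cost_per_missing_year=50))   # 1100.0 -- lower total cost wins

The point is simply that the lowest bid price and the lowest total cost are not necessarily the same bidder.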

Here’s a screenshot that hopefully illustrates this in Strategic Sourcing:


When we start discussing creating and analyzing events in later blogs, we will show where you set up the cost factors associated with an event.

Note that for auction events, bidders can compete based on score or price only.

More next week. In the meantime, drop me an email with any questions or comments.

Flash DEMO: OBIEE Exalytics for Retail

Posted by

Previously, we have discussed the real power of BI: making business decisions out of the oodles (that’s a technical term for a lot) of data within organizations, embedding BI into the business process, and processing that data extremely quickly and efficiently.

So, while we have talked about it and tried to explain the concepts and value, there is nothing like seeing it in action.  To that end, Oracle has put together a very nice flash demo of OBIEE Exalytics in a retail environment.  The example is easy to understand and shows the power of BI when it is used to properly “manage” the business (remember our adage: ERP allows you to “run” your business, but BI allows you to “manage” it).  It is easy to see how BI can positively influence inventory levels, then sales, then the profit potential of this company.

Check out the demo here, and as always, email me if this is something of interest to you.

###

More links:

MIPRO Consulting main website.

MIPRO on Twitter and LinkedIn.

About this blog.

Converting Your Reporting to Oracle BI Publisher

Posted by

We’re seeing this more and more: as Oracle’s OBIEE platform gains steam, many organizations are making the move to convert existing reports from Crystal, Actuate or Oracle Reports to BI Publisher. Report conversion can be a daunting task, not necessarily because of the technology move, but more so because of simply understanding, cataloging and consolidating the current collection of reports.  Over time, many reports are modified, added and personalized, and it is very easy to lose track of what you have — and more importantly, what is valuable.

Oracle has a very good white paper which provides all sorts of great information regarding the conversion process.  Probably not unlike you, when I need to know something that I don’t necessarily have experience with, I often look to white papers and red papers to give me that initial push — and often I am disappointed.  I need to know a strategy, what type of skills I need and how much effort it will take; in so many cases, that information is what is most obviously missing.  Well, this white paper outlines a strategy, outlines key considerations, provides several company examples and outlines the resources that are required for success. I highly recommend it to all of my clients. Entitled Planning to Convert to BI Publisher, you can check it out here (PDF link; you’ll need your Oracle login and password).

Here are a few of the key elements I took away from the white paper:

  1.  Prior to any report conversion activity, you must evaluate your current catalog of reports.  Determine what reports are being used, what reports are important and what reports should be carried over.  Don’t forget to evaluate the new technology and capabilities of Oracle BI Publisher.  Old technologies may propagate the need for multiple reports because of only slight variations in requirements.  There is an opportunity to reduce the overall number of reports simply based upon the capabilities of BI Publisher.  Also, challenge the end users on the needs of the reports.  If a report has not been run in a long time, is it really necessary?  Remember, end users can also create their own reports which may impact how many reports must be converted.
  2. You can categorize reports:
      • By data
      • By complexity
      • By size
  3. Understand that there are conversion tools to help with the move to BI Publisher.  Certainly manual conversion is an option, but if any time can be saved with the conversion tools, that option should be explored.  I have not used these conversion tools, so I do not have direct experience with how well they work or what share of a report they convert. I would recommend you take a sample of low, medium and high complexity reports and evaluate the success rate of the conversion tools when determining your approach. If you do this, I’d be interested in your results. Drop me an email.
  4. The white paper provides some customer examples and general rules of thumb on the level of effort.  From the white paper: “A general thumb rule would be 3 man days for manually converting a simple to medium complex report and 5 man days for converting a highly complex report. The time taken will reduce over a period of time with experience in conversion. Using the Conversion Tools, the conversion time would be reduced tremendously and may require few hours to 1 man day for converting a simple to medium complex report. For converting a medium to highly complex report, the effort may vary between 2 – 3 days. The time taken will reduce over a period of time with experience in conversion.”
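Those rules of thumb lend themselves to a quick back-of-the-envelope estimate before you commit to an approach. The Python sketch below is purely illustrative: the report counts are hypothetical, and the per-report figures are taken from the ranges quoted above (manual: 3 days for simple-to-medium, 5 days for complex; tool-assisted: roughly 1 day for simple-to-medium, up to 3 days for complex).

# Back-of-the-envelope conversion effort estimate (person-days).
# Per-report figures follow the white paper's rules of thumb; report counts are hypothetical.
MANUAL_DAYS        = {"simple_medium": 3, "complex": 5}
TOOL_ASSISTED_DAYS = {"simple_medium": 1, "complex": 3}

inventory = {"simple_medium": 120, "complex": 30}   # reports left after cleaning up the catalog

manual_total = sum(count * MANUAL_DAYS[cat] for cat, count in inventory.items())
tool_total   = sum(count * TOOL_ASSISTED_DAYS[cat] for cat, count in inventory.items())

print(f"Manual conversion:        ~{manual_total} person-days")   # ~510
print(f"Tool-assisted conversion: ~{tool_total} person-days")     # ~210

Even a rough calculation like this makes it clear why cleaning up the catalog first and sampling the conversion tools are worth the effort.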
Report conversion is often a daunting task that prevents organizations from moving to a more integrated, streamlined reporting option than what they are using now. That doesn’t have to be the case. If you want to explore this option more — or even just pick my brain about it — I’m always happy to chat. Let me know.

###

More links:

MIPRO Consulting main website.

MIPRO on Twitter and LinkedIn.

About this blog.

Piloting the Power of Business Intelligence

Posted by

In several of our previous blogs we have discussed the proper way to build a business intelligence enterprise solution.  Many of you have responded in agreement with much of the content and the steps required to truly unlock the potential of BI.  However, a number of readers have reached out and asked for ways to promote BI within an organization that does not have a deep understanding or appreciation of what business intelligence can do — in other words, organizations that still view BI as mere reports.

Promoting BI certainly is a challenge if the organization lacks appreciation for what business intelligence truly represents.  However, there are still proven methods to promote BI within the organization.  The key is to find one area of the business that has acute pain, secure an executive sponsor who needs that pain resolved, and pilot the power of business intelligence.  Done correctly, the executive and business unit will become BI evangelists, spreading the word internally and driving the desire for business intelligence organization-wide.

The key to this approach is to deliver results very quickly.  Contrary to the holistic approach preached previously, this will require an iterative development process.  You still need to be careful to treat this as a pilot: make sure the pilot does not become the foundation of the organizational BI structure, or you may unknowingly paint yourself into corners that will later require significant rework.

In summary the steps to this approach include:

  • Find pain in the organization.
  • Secure an executive sponsor who requires a solution to that pain.
  • Take an iterative approach to development.  The first delivery may not be perfect, but it should show speed of response. (Also, don’t worry about getting every BI requirement correct the first time.  Let the process develop the requirements.)
  • Demo the solution — perhaps start with a simple dashboard (see our earlier blog on how to build a proper BI dashboard) as this will help displace the myth of BI as a simple report.
  • Tweak the solution based upon feedback.
  • Once tweaked, let the business begin to utilize the BI solution, appreciate it and evangelize it. Once the business realizes BI truly helps enable more mature decision-making, that’s when buy-in occurs.

Done well, this approach should produce an influx of requests for increased business intelligence analytics across the organization.  Once that need is established, you can take the proper steps to build out your enterprise BI program.

Questions about this? I hear them all the time. If you have any, don’t be afraid to reach out.

###

More links:

MIPRO Consulting main website.

MIPRO on Twitter and LinkedIn.

About this blog.

 

How Strategic Is Your BI Solution?

Posted by

In the late ’90s, all of the buzz was around ERP, and understandably so.  A lot of the buzz I hear today is about Business Intelligence (BI).

There are probably a variety of reasons for this.

Chief among them, in my experience, is the notion that ERP was pervasive in the ’90s because of the need to replace antiquated accounting systems to solve an imminent Y2K problem.  BI, on the other hand, is pervasive because everyone now has an ERP-type solution and they want more.  But what does “more” mean?

One of the great differences that I have talked about in the past is the need to take an ERP transaction-based system that reports historical information and turn that into useful, quantifiable information so you can plan the future and make real-world decisions.  For the first time, I have found an article that clearly draws a correlation between the two views.

In his article entitled 6 Habits of True Strategic Thinkers, author Paul J.H. Schoemaker describes characteristics and traits of executives who would like to spend more time focused on strategic thinking rather than the day-to-day activities of their organization.  Interestingly enough, these characteristics are the very same traits that separate an accounting solution that reports history from a solid BI solution that allows you to make perceptive decisions about your business.  These traits include:

  • Anticipate
  • Think Critically
  • Interpret
  • Decide
  • Align
  • Learn

You can read the full article over at Inc. online.

I encourage you to do so with two questions in mind:

  1. Are you a strategic leader?
  2. Do your systems provide the information (not merely data) and knowledge that allow you to be a strategic leader?

If the answer to either of these is no, it may be time for a change.

If you’re interested in BI fundamentals and how we look at BI implementations and their value, don’t miss our whitepaper, MIPRO’s Business Intelligence Manifesto: Six Requirements for an Effective BI Deployment. And naturally, if you have questions, want to tell me I’m wrong, or just want to go over what you’ve been kicking around in terms of BI lately, don’t be afraid to drop me an email.

###

More links:

MIPRO Consulting main website.

MIPRO on Twitter and LinkedIn.

About this blog.

What Is ‘Real-time Data Warehousing’?

Posted by

I get asked the following questions often:

What is real-time data warehousing?

Is there such a thing as real-time?

Who defines ‘real-time’ anyway?

Is real-time really just ‘near-time’?

It comes down to this concept: the ability to have valuable, quality data in your data warehouse as quickly as that data is generated within your operational system can be very powerful and beneficial to your organization.  This concurrence is the crux of real-time data warehousing: getting production data into your data warehouse as it’s created.  Easy to say, harder to do.
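To ground the idea, here is one common pattern sketched in Python: a micro-batch loop that polls the operational system for rows changed since the last load and pushes them into the warehouse. This is a sketch under stated assumptions only: the table and column names, the DB-API-style connections, and the simple delete-then-insert upsert are illustrative rather than a reference to any particular product, and parameter placeholder syntax varies by database driver.

import time

POLL_SECONDS = 60  # in practice, "real time" often means micro-batches this small

def sync_changes(src_conn, dw_conn, last_loaded_at):
    """Copy rows changed in the operational system since the last load into the
    data warehouse. Connections are assumed to be DB-API compliant; the table
    and column names are hypothetical."""
    src = src_conn.cursor()
    src.execute(
        "SELECT order_id, status, qty, updated_at FROM orders WHERE updated_at > ?",
        (last_loaded_at,),
    )
    rows = src.fetchall()
    if not rows:
        return last_loaded_at          # nothing new; keep the same high-water mark

    dw = dw_conn.cursor()
    # Simple upsert: remove any stale copies, then insert the fresh rows.
    dw.executemany("DELETE FROM dw_orders WHERE order_id = ?",
                   [(r[0],) for r in rows])
    dw.executemany("INSERT INTO dw_orders (order_id, status, qty, updated_at) "
                   "VALUES (?, ?, ?, ?)", rows)
    dw_conn.commit()
    return max(r[3] for r in rows)     # new high-water mark for the next poll

# Driver loop (sketch): poll every minute and carry the high-water mark forward.
# while True:
#     last_loaded_at = sync_changes(src_conn, dw_conn, last_loaded_at)
#     time.sleep(POLL_SECONDS)

Real implementations typically use change data capture or streaming replication rather than polling, but the goal is the same: keep the warehouse within seconds or minutes of the operational system.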

There is an excellent whitepaper at The Data Warehousing Institute which discusses in detail real-time data warehousing, data integration, data federation and data virtualization.  It does a nice job of covering some options and provides some key thoughts to consider if your organization is working its way toward real-time data warehousing.  I highly recommend it.

In addition to the insights provided in this whitepaper, I would also propose that real-time data warehousing can be that final step which allows an organization to achieve true BI excellence. In my experience, it’s one of the last hurdles a company has to jump.

While many organizations have a data warehouse, most of them still do not get 100% of their decision-making data from their data warehouses. The lack of real-time data is one of the key reasons that is true – the data warehouses are incomplete, even if just by a few hours of data age.

For example, if you run a high-turn inventory or manufacturing operation, you cannot work with inventory levels for raw materials or finished goods that are hours old. For that reason, many organizations still go to the transaction system for some of their time-sensitive inventory data. That’s the data freshness needed to make real decisions.  If it’s hours old in the data warehouse, it’s nice and all, but it’s academic.  Nobody trusts it for heavy-duty decisions.

If a company is swapping back and forth between the data warehouse and transactional systems for information, it is possible (certain?) to run into training and tools issues. Training questions arise such as when should I use the data warehouse vs. the operational system?  Why? If I use the operational data, do I use the native query and reporting tools or do I use the enterprise BI tools? Does this organization have to now support multiple tools?  Which ones?  Is it faster and easier to use the native tools instead of the enterprise BI tools? Will data conflicts between data warehouse data and operational data arise?

Moving to real-time data warehousing and providing one source for all data needs can alleviate these issues. Truly utilizing one enterprise BI tool, and training and supporting only that tool, can be a tremendous advantage. It is a challenge to accomplish, but the TDWI whitepaper does provide some good insight to get moving in the right direction.  We have also helped clients achieve that ‘last mile’ of data warehouse efficiency so that the DW can be relied upon as the platform-of-record for all decisions, mundane and major alike.

If you are working towards a real-time data warehouse but are struggling to close the loop on the real-time part of the equation, we can help.  Please email me and I’ll be happy to get in touch with you.

Any other questions?  Please ask away in the comments.

###

MIPRO Consulting is a nationally-recognized consulting firm specializing in PeopleSoft Enterprise (particularly Enterprise Asset Management) and Business Intelligence. You’re reading MIPRO Unfiltered, its blog. If you’d like to contact MIPRO, email is a great place to start, or you can easily jump over to its main website. If you’d like to see what MIPRO offers via Twitter or Facebook, we’d love to have you.

More business intelligence posts.

Building an Effective BI Dashboard

Posted by

In our BI manifesto whitepaper, we discuss and outline how to effectively build a dashboard and avoid common mistakes.  Dashboards have a very specific use and should not be confused with reports; this in itself is a common conceptual mistake.

The following are key elements of a well-designed dashboard:

  • Dashboards should be designed to answer or provide insight into specific business questions or issues.
  • Typically dashboards contain 5 to 7 key metrics.  Too many metrics clutter the dashboard and dilute its focus.  Conciseness is important.
  • Dashboards should allow you to drill back to the source data for further analysis.
  • Dashboards are not simply another way to graphically represent a report.
  • Dashboards are designed for a specific audience level.  Examples include Supply Chain VP, Inventory Manager, Plant Manager.
  • Each dashboard should support the next audience level up.  For example: the plant manager will likely be interested in metrics and analysis which support the metrics and analysis of the supply chain VP.  Likewise, the inventory manager will be interested in the inventory metrics which support the metrics of the plant manager. Certain metrics may be more granular, but if done properly, all metrics are in direct alignment.
  • Dashboards should contain metrics with thresholds so users can instantly see if there are issues which require further investigation.  These include the red, yellow, green dials and meters which show metrics in or out of tolerance.  Immediate understanding of a metric’s disposition is critical.
  • Alerts are typically used to proactively notify key users when metrics fall out of tolerance, allowing issues to be investigated and resolved early (a minimal threshold-and-alert sketch follows this list).
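As an illustration of the threshold and alert ideas above, here is a minimal Python sketch. The metric names and limits are invented, and the print statement is simply a stand-in for whatever notification mechanism your BI platform provides.

# Illustrative thresholds for a supply chain dashboard; metric names and limits are hypothetical.
THRESHOLDS = {
    # metric: (green_max, yellow_max) -- anything above yellow_max is red
    "days_of_inventory_on_hand": (30, 45),
    "late_shipments_pct":        (2.0, 5.0),
}

def status(metric, value):
    """Map a metric value to a green/yellow/red disposition."""
    green_max, yellow_max = THRESHOLDS[metric]
    if value <= green_max:
        return "green"
    return "yellow" if value <= yellow_max else "red"

def alert_if_needed(metric, value):
    """Stand-in for the proactive alerting a BI platform would send to key users."""
    disposition = status(metric, value)
    if disposition != "green":
        print(f"ALERT [{disposition}]: {metric} = {value}")
    return disposition

alert_if_needed("late_shipments_pct", 6.3)   # -> ALERT [red]: late_shipments_pct = 6.3

The specifics will vary by tool, but the principle holds: every metric on the dashboard should carry a disposition a user can read at a glance, and anything out of tolerance should come to the user rather than wait to be found.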

Putting these elements into practice, we can look at a sample supply chain dashboard below.

[Image: BI supply chain dashboard example]

Imagine the supply chain VP logging in each morning to their supply chain dashboard.  At a glance, using dials and alerts, they can immediately identify those elements which are not within tolerances.  These may be at a specific plant or facility level, or even a consolidated view of the facilities across the country that at a summary level are out of tolerance.  The VP can drill down for investigation or simply contact the facility manager for explanation or action.  This allows action to be taken prior to significant impact to the business.

In a future post, I will illustrate the power of “what-if” analysis and dashboards that allow business decisions to be made more intelligently.

###

MIPRO Consulting is a nationally-recognized consulting firm specializing in PeopleSoft Enterprise (particularly Enterprise Asset Management), Workday and Business Intelligence. You’re reading MIPRO Unfiltered, its blog. If you’d like to contact MIPRO, email is a great place to start, or you can easily jump over to its main website. If you’d like to see what MIPRO offers via Twitter or Facebook, we’d love to have you.

More Business Intelligence posts.

BI Sidebar: Master Data Management and Data Quality

Posted by

We have discussed previously the importance of data quality to business intelligence. I’d like to go a bit deeper on that topic here in this post.

It is plainly obvious that your business intelligence will only be as good as the quality of your data; this, in turn, means the quality of the business answers derived from your BI logic is also dependent on the data’s quality.  It’s the cliché garbage-in, garbage-out scenario.

For the sake of a quick review, let’s recap several important reasons to implement a data quality strategy, including:

  1. The ability to trust data
  2. Accurate and timely information
  3. Compliance
  4. Security

While we have discussed data quality as it pertains to business intelligence success, we have not touched on the tight interaction and necessity of a strong master data management (MDM) solution.  A strong MDM solution is designed to consolidate, enrich, synchronize and cleanse data across all applications of the enterprise.

Imagine a distinct CRM solution, a distinct order management solution, a distinct inventory solution, a distinct billing and accounts receivable solution. All of these products have at least one key and critical element in common: the customer. If there is no mechanism in place to control the entry, deletion or editing of customers across these solutions, the business intelligence implications are absolutely perilous.

Take, for example, the simple customer ID.  If it is not the same across all of these distinct-but-interfaced solutions, the reporting and the ability to trust information will be seriously compromised. Unfortunately, it is not an uncommon scenario for business units within the same organization to describe products, customers, vendors, etc. with different descriptors or IDs. These organizations end up with data quality issues, or some combination of transformation tables designed to sync up the different naming and identification conventions.

It can be a nasty mess, full-stop.
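To make the mess concrete, the Python sketch below shows the kind of crosswalk (transformation table) organizations end up maintaining by hand when customer IDs differ across systems. Every system name and ID here is invented for illustration; an MDM hub exists precisely so you do not have to build and babysit tables like this yourself.

# Hypothetical crosswalk mapping each system's local customer ID to one master ID.
CUSTOMER_CROSSWALK = {
    ("CRM",       "ACME-0042"): "MDM-1001",
    ("ORDERS",    "C000918"):   "MDM-1001",
    ("BILLING",   "10-4477"):   "MDM-1001",
    ("INVENTORY", "ACME"):      "MDM-1001",
}

def master_customer_id(system, local_id):
    """Resolve a system-specific customer ID to the master ID, if one is known."""
    key = (system, local_id)
    if key not in CUSTOMER_CROSSWALK:
        raise KeyError(f"No master record for {system} customer '{local_id}'")
    return CUSTOMER_CROSSWALK[key]

# Without a mapping like this (or an MDM hub enforcing a single ID at entry),
# revenue for the same customer is reported under four different identifiers.
print(master_customer_id("BILLING", "10-4477"))   # -> MDM-1001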

An MDM solution with a single point of entry and control reduces the risk of uncontrolled and unmanageable differences in master data.  When data is cleansed upon entry, it has a significantly better chance of remaining clean and synchronized across the enterprise, which results in better business intelligence.  MDM solutions hold the single source of truth across key business dimensions such as customers, vendors, products and locations.

As your business intelligence strategy matures, investigate the potential need for and implementation of an MDM solution to better control the quality of your data and, therefore, the quality of your business intelligence solution. I encourage you to learn more about Oracle’s MDM solution, and if you have further questions or want to know more about this topic, email me.  I’m happy to help.

###

MIPRO Consulting is a nationally-recognized consulting firm specializing in PeopleSoft Enterprise (particularly Enterprise Asset Management), Workday and Business Intelligence. You’re reading MIPRO Unfiltered, its blog. If you’d like to contact MIPRO, email is a great place to start, or you can easily jump over to its main website. If you’d like to see what MIPRO offers via Twitter or Facebook, we’d love to have you.

More Business Intelligence posts.

©2017 MIPRO Unfiltered