IT Consulting

Business Intelligence and Data Mining

Data mining is the practice of digging into a database to extract patterns useful to a wide range of profiling activities (e.g., marketing actions, business strategies, corrections to sales techniques), and Business Intelligence is the set of tools that makes this possible.

OLAP (On-Line Analytical Processing) Cubes and pivot grids are further tools that allow you not only to analyze the past more effectively but also to project complex future scenarios.

An OLAP Cube is a data structure better suited to the needs of Data Mining and Business Intelligence.
Arranging data into OLAP Cubes overcomes the limitations of relational databases in both performance and the number of informational viewpoints.

Business Intelligence (BI) is a set of techniques used to extract and analyze business data (e.g., sales revenue by product, vendor, or territory).
Until now, we have relied on relational databases for these tasks, and while traditional two-dimensional results (e.g., all products by vendor) could be obtained with relative ease and good performance, extending the analysis to three or more viewpoints was often, if not always, accomplished through lengthy, costly, and challenging intermediary steps.
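
To make the multi-viewpoint idea concrete, the Python sketch below pre-aggregates a tiny, invented fact table along every combination of three dimensions. This is conceptually what an OLAP Cube stores, so that any viewpoint (product by vendor, vendor by territory, all three, and so on) becomes a simple lookup rather than a fresh relational query. All names and figures are purely illustrative.

    from collections import defaultdict
    from itertools import combinations

    # Illustrative fact table: (product, vendor, territory, revenue).
    facts = [
        ("Widget", "Acme", "North", 1200.0),
        ("Widget", "Acme", "South", 800.0),
        ("Gadget", "Acme", "North", 650.0),
        ("Gadget", "Bolt", "South", 900.0),
    ]

    # Pre-aggregate every subset of the three dimensions -- conceptually
    # what an OLAP Cube materializes so each "viewpoint" is a lookup.
    dimensions = ("product", "vendor", "territory")
    cube = defaultdict(float)
    for product, vendor, territory, revenue in facts:
        row = {"product": product, "vendor": vendor, "territory": territory}
        for r in range(len(dimensions) + 1):
            for dims in combinations(dimensions, r):
                key = tuple((d, row[d]) for d in dims)
                cube[key] += revenue

    print(cube[(("vendor", "Acme"),)])                            # 2650.0: all Acme sales
    print(cube[(("product", "Widget"), ("territory", "North"))])  # 1200.0
    print(cube[()])                                               # 3550.0: grand total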

Data Mining Extensions (DMX): As SQL is the query language for relational databases, Data Mining Extensions (DMX) is the query language for Data Mining models.
Data Definition Language (DDL) and Data Manipulation Language (DML) are both part of DMX and allow the management and analysis of data mining models.
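
For a feel of the language, the sketch below embeds two DMX statements in Python strings: a DDL statement creating a mining model and a DML prediction query against it. The model name, column names, data-source name, and the run_dmx helper are all hypothetical; in practice the statements would be sent to a SQL Server Analysis Services instance through an ADOMD-style client.

    # DDL side of DMX: define a mining model (all names are illustrative).
    create_model = """
    CREATE MINING MODEL CustomerChurn (
        CustomerKey LONG KEY,
        Age         LONG CONTINUOUS,
        Territory   TEXT DISCRETE,
        Churned     TEXT DISCRETE PREDICT
    ) USING Microsoft_Decision_Trees
    """

    # DML side of DMX: a prediction join against the trained model.
    predict_query = """
    SELECT t.CustomerKey, PredictProbability(CustomerChurn.Churned, 'Yes')
    FROM CustomerChurn
    PREDICTION JOIN
        OPENQUERY(SalesDW,
                  'SELECT CustomerKey, Age, Territory FROM dbo.Customers') AS t
    ON CustomerChurn.Age = t.Age AND CustomerChurn.Territory = t.Territory
    """

    def run_dmx(statement: str) -> None:
        """Hypothetical helper: in a real setup this would forward the DMX
        text over an Analysis Services connection (e.g., an ADOMD client)."""
        print(statement.strip())

    run_dmx(create_model)
    run_dmx(predict_query)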

Achieving and Maintaining Data Quality

Today's technologies allow for high-quality solutions: intuitive user interfaces, an elegant look and feel, everything needed to give the end user a top-of-the-line experience.

It seems, however, that while the front end is getting all the attention, we have forgotten about the most important element, the one without which all of the above would have no reason to be: the data.

Certainly, excellent solutions exist that can handle data integrity and some aspects of "data cleansing", but what about the less controllable situations caused by duplicate, incomplete, inaccurate, or inconsistent data input?

SQL Server Integration Services (SSIS)
Microsoft SQL Server Integration Services specializes in ETL (Extraction, Transformation, and Loading) operations.
Although the Extraction (gathering data from different sources, in different formats) and the Load (redeployment of the data) are very important, the Transformation (the "T" in ETL) is the step that handles cleansing, standardization, joining, merging, and other data-quality improvement tasks. This step, the Transformation, is the focus of this white paper.
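
As a minimal illustration of the kind of cleansing a Transformation step performs, the Python sketch below deduplicates and standardizes a handful of invented customer records. In a real SSIS package this work would be done with built-in transformations (lookups, derived columns, fuzzy grouping); here the field names, records, and standardization table are all made up.

    # Illustrative input rows with the defects discussed above:
    # duplicates, inconsistent casing, and missing values.
    raw_rows = [
        {"name": "jane DOE", "state": "il", "phone": "815-555-0100"},
        {"name": "Jane Doe", "state": "IL", "phone": "815-555-0100"},  # duplicate
        {"name": "John Roe", "state": "Illinois", "phone": None},      # incomplete
    ]

    STATE_NAMES = {"illinois": "IL"}  # toy standardization table

    def standardize(row: dict) -> dict:
        """Normalize casing and map full state names to codes."""
        state = row["state"].strip().lower()
        return {
            "name": row["name"].title(),
            "state": STATE_NAMES.get(state, state.upper()),
            "phone": row["phone"] or "UNKNOWN",
        }

    # Standardize first so duplicates become detectable, then dedupe
    # on a business key (here: name + phone).
    seen, clean_rows = set(), []
    for row in map(standardize, raw_rows):
        key = (row["name"], row["phone"])
        if key not in seen:
            seen.add(key)
            clean_rows.append(row)

    print(clean_rows)
    # [{'name': 'Jane Doe', 'state': 'IL', 'phone': '815-555-0100'},
    #  {'name': 'John Roe', 'state': 'IL', 'phone': 'UNKNOWN'}]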

Microsoft Dynamics CRM

From releases 1.2 and 3.0, which allowed little or no customization, we have seen a distinct increase in the customization capabilities of this remarkable piece of software.

Through the releases of CRM 4.0, 2011, 2013, and now 2015, Microsoft Dynamics CRM has increasingly positioned itself as a strong and reliable development framework; there is not much that cannot be done to make it fit very specific user experiences or business flows.

Make sure that customizations do not work "against" CRM's philosophy. Respecting CRM's paradigm ensures that extensions will not interfere with CRM's core processes and guarantees a clear path to future enhancements of the product.

Web Design

Your website is a representation of yourself or your business on the World Wide Web.
For this representation to be positive, you must keep the site's content up to date: the prices listed, the items or services offered, special offers, along with the address, phone numbers, and contact information must all be current. The image projected by your website can quickly turn negative if visitors see expired offers, outdated information, or content that has not changed in a long time. Think of your website as a store window that, in order to remain attractive, must periodically be redecorated.

Types of Websites

Putting aside the different technologies available and the quality of the design, your investment may also vary depending on the type of website you wish to build.

  • Public website
    • Static content; occasional changes that a designer makes at your request.
    • CMS (Content Management System), allowing you to update the content of your website, in some cases with information retrieved automatically from a database.
  • Enterprise website
    • Static content; occasional changes that a designer makes at your request.
    • CMS (Content Management System), allowing you to update the content of your website, in some cases with information retrieved automatically from a database.

Poor data quality costing companies millions of dollars annually

Jeff Kelly, News Editor
Published: 25 Aug 2009

While the use of data quality software has hit an all-time high, companies, by their own admission, are still losing boatloads of money because of inaccurate data, according to a recent survey.

The average organization surveyed by Gartner said it loses $8.2 million annually through poor data quality. Further, of the 140 companies surveyed, 22% estimated their annual losses resulting from bad data at $20 million. Four percent put that figure as high as an astounding $100 million.

Much of this loss is due to lost productivity among workers who, realizing their data is incorrect, are forced to compensate for the inaccuracies or create workarounds when using both operational and analytic applications, Ted Friedman, an analyst with the Stamford, Conn.-based research firm, said in an interview.

Still, losses could be even higher were it not for the increasing adoption of data quality tools. According to Gartner, the data quality tools market grew by 26% in 2008, to $425 million.

Of those companies that use data quality tools, the survey found, many have begun deploying them to support projects other than business intelligence (BI) and data warehousing (DW), previously the two most common data quality use cases.

"The tools are not cheap, so people are doing the right thing by finding multiple ways to use them," Friedman said.

Specifically, around 50% of survey respondents said they are using data quality tools to support master data management (MDM) initiatives, and more than 40% are using data quality technologies to assist in systems and data migration projects.

As the survey indicates, however, most companies still have a long way to go to achieve comprehensive data quality processes. A common shortcoming, Friedman said, is that most data quality tools are difficult for non-power users to understand and consequently are used by only a small group of workers, usually IT staff.

According to the survey, only between one and five workers regularly interact with data quality tools at 58% of organizations. Another 22% said between six and 10 workers use data quality tools.

To improve data quality throughout the organization, Friedman said, vendors must make data quality tools simpler to use so that business types can use them and begin taking responsibility for the quality of their own data.

"In particular, providing data profiling and visualization functionality (reporting and dashboarding of data quality metrics and exceptions) to a broader set of business users would increase awareness of data quality issues and facilitate data stewardship activities," Friedman wrote in an accompanying report.

"Directly engaging users in specifying and maintaining business rules for cleansing, matching and monitoring would also aid a shift in culture toward the business having responsibility and accountability for properly managing data," he wrote.

Friedman also said organizations are increasingly applying data quality to data domains other than customer data, but more still needs to be done. He said the quality of financial data in particular costs some companies considerable money in the form of fines for incorrect regulatory filings.

Companies should also invest in technology that applies data quality rules to data at the point of capture or creation, he said, not just "downstream," as when loading data into a data warehouse.

According to the survey, less than half of respondents currently use data quality tools at the point of capture or creation, which often happens in operational systems, like CRM software.

"Historically, data quality tools have been most often used in an offline, batch mode -- cleansing data at a point in time outside the boundaries of operational applications and processes," Friedman wrote. "Gartner advises clients to consider pervasive data quality controls throughout their infrastructure, ensuring conformance of data to quality rules at the point of capture and maintenance, as well as downstream."
