Let’s jump right in: hiring and interviewing take many forms, with various philosophies in play. Tim Graettinger’s philosophy seems built around the needs of an outdated headhunter reviewing an outdated HR manual. If today’s headhunters considered today’s date (this is the digital age), a mobile philosophy focused on who can actually use the tools of data mining might establish a more positive interviewing process and get better results. The old hiring structure is interfering with today’s high-tech business needs.
On the other hand, to communicate expectations effectively and build confidence, there must be mutual trust between HR and the candidate. It would be more productive to open an interview with a real-life business-requirements question that challenges the candidate to create a data model, provided a laptop loaded with RStudio, RapidMiner, KNIME, Microsoft BI, and Excel is on hand. To generate synergy, the candidate is given a time limit to produce a result by clustering data loaded from a CSV file. There is no need for a stiff criminal-investigation shakedown masquerading as an interview to find the right person for the job. If the candidate can produce a result in that hands-on data mining environment, that is all the proof HR needs to make a hiring decision. Today’s industries are performance based, so the data mining interview must rely on hands-on performance criteria to meet employers’ needs.
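To make the exercise concrete, here is a minimal sketch of the kind of task described above: clustering customers from a small CSV into two groups. It is written in plain Python (rather than the RStudio/RapidMiner/KNIME tools named above), and the column names and data are made up for illustration.

```python
import csv
import io
import math

# Made-up interview data: two numeric features per customer.
# (In a real interview this would be read from a file on the laptop.)
CSV_TEXT = """spend,visits
1.0,1.2
0.8,1.0
1.1,0.9
8.0,8.2
7.9,7.8
8.3,8.1
"""

def load_points(text):
    """Parse the CSV into (spend, visits) tuples."""
    reader = csv.DictReader(io.StringIO(text))
    return [(float(row["spend"]), float(row["visits"])) for row in reader]

def kmeans(points, k, iters=10):
    """Plain k-means with deterministic initialization (first k points)."""
    centers = points[:k]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[nearest].append(p)
        # Recompute each center as the mean of its cluster.
        centers = [
            tuple(sum(vals) / len(cl) for vals in zip(*cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers, clusters

centers, clusters = kmeans(load_points(CSV_TEXT), k=2)
print(sorted(len(c) for c in clusters))  # → [3, 3]
```

A candidate who can produce something like this, and explain the choice of k and the distance metric, demonstrates exactly the hands-on skill the interview is meant to test.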
The article makes clear that a graduate student fresh out of college, with no real-world data mining project experience, is unlikely to be invited to interview for a data mining position at a Fortune 500 company. This power-packed article reinforces the urgent need for all data mining prospects to develop hands-on skills with algorithms as quickly as possible if they plan to compete in the data mining arena. Enterprises need people who can use open-source data mining tools to cluster data, now! The information presented in the article has turned a future problem into an opportunity.
Greater challenges are on the horizon; now is the time to master RStudio, RapidMiner, KNIME, Microsoft BI, and Excel, and to gain access to the online data repositories, for job security. Opportunity awaits the person who can see the light at the end of the dark tech tunnel while running toward success with a backpack full of the latest data mining tools. Generally speaking, the personality of the data miner is, of course, a major factor: a positive attitude, backed by confidence, will prevail!
To start, it is practical for the business owner to stay educated on new developments in Enterprise Data Management (EDM). I spent the past month researching EDM, in part through an online education environment. Enterprise Data Management is the soul of modern business development in the digital age.
In an effort to share what I discovered, let’s begin with my adventure into the world of Big Data. “Enterprise” refers to the business process as one activity; for instance, Live Nation Entertainment has a business process that includes activities from Ticketmaster, Live Nation Concerts, Front Line Management Group, and Live Nation Network. The source data from all these business activities must be collected, extracted, transformed, and loaded before it can be used for reports or analysis. Although the business process varies from company to company, Enterprise Data Management is a necessary solution in today’s business environment.
During my research, I discovered that the data management process must control and manage data with the assistance of technology support systems. Generally speaking, ETL development starts with a high-level plan that is independent of any specific technology approach (Kimball Group). The Data Warehouse Toolkit by Ralph Kimball and Margy Ross is recommended for further reading. I discovered that there are many types of projects where a well-defined approach to managing data is the key to success, for instance deciding what to put in the fact table. The ETL process for incremental fact-table processing differs from the initial load and does not need to be fully automated; however, the facts can only be loaded after the business process has been decided.
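The idea of incremental fact-table processing can be sketched simply: rather than reloading everything, only rows that have arrived since the last run are appended. The sketch below uses Python lists of dictionaries to stand in for the fact table and the source system; the table, column names, and data are hypothetical.

```python
# Hypothetical fact table already populated by earlier loads.
fact_sales = [
    {"txn_id": 1, "amount": 10.0},
    {"txn_id": 2, "amount": 25.0},
]

# Hypothetical source extract, which overlaps with what is already loaded.
source_rows = [
    {"txn_id": 2, "amount": 25.0},   # already in the fact table
    {"txn_id": 3, "amount": 40.0},   # new since the last run
    {"txn_id": 4, "amount": 15.0},   # new since the last run
]

def incremental_load(fact, source):
    """Append only source rows beyond the current high-water mark."""
    watermark = max((row["txn_id"] for row in fact), default=0)
    new_rows = [row for row in source if row["txn_id"] > watermark]
    fact.extend(new_rows)
    return len(new_rows)

loaded = incremental_load(fact_sales, source_rows)
print(loaded, len(fact_sales))  # → 2 4
```

A real ETL tool tracks the high-water mark (a transaction id, a load date, or an audit key) in metadata between runs, but the control-flow decision is the same one shown here.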
Nevertheless, the ETL architecture must have the capability to implement slowly changing dimension (SCD) logic (Kimball). The Kimball technical system architecture separates the data and processes comprising the DW/BI system into the back-room extract, transformation, and load (ETL) environment and the front-room presentation area, with metadata processed throughout the operation. Metadata is all the information that defines and describes the structure, operation, and contents of the DW/BI system (Kimball).
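The best-known form of SCD logic is the Type 2 pattern: when a tracked attribute changes, the current dimension row is expired and a new row is inserted, preserving history. The sketch below shows that pattern with a made-up customer dimension; the column names and dates are illustrative, not from any real system.

```python
from datetime import date

# Hypothetical customer dimension with Type-2 housekeeping columns.
customer_dim = [
    {"key": 1, "cust_id": "C1", "city": "Austin",
     "valid_from": date(2020, 1, 1), "valid_to": None, "current": True},
]

def apply_scd2(dim, cust_id, new_city, change_date):
    """Expire the current row and insert a new one if the city changed."""
    current = next(r for r in dim if r["cust_id"] == cust_id and r["current"])
    if current["city"] == new_city:
        return  # no change detected, nothing to do
    # Close out the old row instead of overwriting it.
    current["valid_to"] = change_date
    current["current"] = False
    # Insert the new version with a fresh surrogate key.
    dim.append({"key": max(r["key"] for r in dim) + 1, "cust_id": cust_id,
                "city": new_city, "valid_from": change_date,
                "valid_to": None, "current": True})

apply_scd2(customer_dim, "C1", "Denver", date(2024, 6, 1))
print(len(customer_dim))  # → 2
```

The payoff is that facts recorded before the move still join to the Austin row, while new facts join to the Denver row, which is exactly why the ETL architecture must support this logic.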
As previously mentioned, Enterprise Data Management is a major concern for every business enterprise. The Kimball Group conceived its comprehensive strategy for managing the DW/BI Lifecycle in the mid-1980s. “Regardless of your organization’s specific DW/BI objectives, we believe an overarching team goal should be business acceptance of the DW/BI deliverables to support the business’ decision making” (Kimball). Today’s Big Data arrives from a multitude of sources, and a company must invest time and resources to take advantage of the business opportunities waiting to be discovered through analysis with BI tools.