7 Questions

Q1. Define the term velocity. How is it measured?

Velocity is one of the five "V"s defined for Big Data. It can be defined as the speed at which data is generated during a given period of time, and the speed at which that data moves around. In many business transactions and processes, the speed of data creation is as important as the volume of data generated. Data that is real-time, or close to real-time, gives organizations the leverage to be far more agile than their competitors. For instance, share trades on a stock exchange happen every second, and the speed at which share-broking agencies can capture and act on those transactions can give them a decisive edge.

Data velocity is usually measured by calculating how quickly data is transacted relative to the volume of data being transacted. In-memory analytics typically forms the basis of the techniques and technologies for measuring data velocity, and there are now many software tools that support analyzing such data effectively.
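As a minimal sketch of that idea (the function name and figures are illustrative assumptions, not a standard formula), velocity can be expressed as volume transacted over the time window in which it arrived:

```python
# Hypothetical sketch: data velocity as throughput, i.e. the volume of
# data transacted divided by the time window in which it was transacted.

def data_velocity(bytes_transacted: int, window_seconds: float) -> float:
    """Return average velocity in bytes per second over the window."""
    if window_seconds <= 0:
        raise ValueError("window must be positive")
    return bytes_transacted / window_seconds

# e.g. 500 MB of trade messages arriving in a 10-second window
velocity = data_velocity(500 * 1024 * 1024, 10.0)
print(f"{velocity / (1024 * 1024):.1f} MB/s")  # 50.0 MB/s
```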

 

Q2. Define the term burndown chart. Why is a burndown chart useful?

A burndown chart is a term often used in agile software development. It is a visual representation of the amount of work and effort remaining before the end of a project. It takes the form of a graph with "work" on the Y-axis and "time" on the X-axis. The line typically trends downward, showing the work still remaining over time until it "burns down" to zero.

The starting point of the burndown chart is usually established during iteration planning, when all team members provide estimates for their assigned tasks. The sum of those estimates is the starting point; then, based on the daily record of task completion, the burndown value for each day is calculated, along with the estimated effort still required to complete the whole project.
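The calculation just described can be sketched in a few lines (the task names, estimates, and daily figures below are assumed for illustration):

```python
# Minimal burndown sketch: the starting point is the sum of task
# estimates from iteration planning; each day the completed effort is
# subtracted to get the remaining work that the chart plots over time.

task_estimates = {"extract": 8, "transform": 13, "load": 5, "report": 6}  # hours
start = sum(task_estimates.values())  # starting point of the chart: 32 hours

completed_per_day = [4, 6, 5, 7, 6, 4]  # hours burned each day

remaining = start
burndown = [remaining]
for done in completed_per_day:
    remaining = max(0, remaining - done)
    burndown.append(remaining)

print(burndown)  # [32, 28, 22, 17, 10, 4, 0]
```

Plotting `burndown` against the day index gives the downward-trending line the answer describes.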

The major advantage of the burndown chart is that the team has a fair idea of the expected duration and the effort required to complete the work within the required time, and can manage resource capacity accordingly. For project managers working on complex, time-bound projects, it is a very useful way to analyze and understand the estimated effort required and to plan accordingly.

Q3. Please define the unique characteristics of Big Data.

Big Data is highly relevant to current industry trends, and many companies aim to leverage its potential uses. When we look into the key factors and characteristics of Big Data, the five "V"s (Volume, Variety, Velocity, Value and Veracity) hold the key to Big Data analysis. Overall, these five V's give Big Data the following key characteristics.

Volume: A critically essential component of data analysis. Volume has been one of the major reasons why many companies ran into problems with the RDBMS data warehouses that were previously prevalent. Today, with the advent of Big Data technologies, organizations can manage vast amounts of data in a real-time environment.

Variety: Big Data systems facilitate complex data analysis, in which data in multiple formats (documents, emails, text messages, videos and others) can be easily captured for analysis from different sources. The ability to align this heterogeneous data for analysis is the key attribute.

Velocity: Analyzing data in real time represents a major change for organizations. The speed at which data is generated, and the speed at which it is exchanged, can provide quality insights if analyzed effectively.

Value: Big Data is considered to be of significant value to an organization because of the intrinsic value of the analysis it enables. Data stored in multiple formats and varieties is processed quickly enough to support effective decision making.

Veracity: Indicates the provenance of the data, that is, its source, reliability and accuracy.

 

Q4. Please define the term metadata and explain how it applies to data warehousing and business intelligence.

 

In a data warehouse, metadata can be considered the control panel of the data warehousing and business intelligence system, describing reports, cubes, tables, columns, keys and indexes.

Metadata is very useful for handling data: it provides the rules, transformations, mappings and aggregations required to warehouse huge quantities of data. The power of metadata is that it lets the data warehousing team develop and control the system in a simple manner rather than through scores of programming code, which in turn saves the data warehousing process significant time and money.

Metadata management for data warehousing and business intelligence systems is most effective when tools, processes and people are combined. It is essential that the right tools are used, that processes are established so the business intelligence and data warehousing life cycle is carried out effectively, and that staff are properly trained in metadata management. Many effective tools are available for organizations to use metadata systems for data warehousing and business intelligence.
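To illustrate the "rules, transformations, mappings and aggregations" idea, here is a hypothetical metadata record for one warehouse column (all table and column names are invented for the example):

```python
# Illustrative metadata record: how a source column maps into a warehouse
# table, with its transformation and aggregation rules kept as data
# rather than hard-coded in many ETL programs.

column_metadata = {
    "target_table": "fact_sales",
    "target_column": "net_amount",
    "source": "orders.gross_amount",
    "transformation": "gross_amount - discount",
    "data_type": "DECIMAL(12,2)",
    "aggregation": "SUM",
    "indexed": False,
}

# An ETL or BI tool can read records like this and generate the load and
# reporting logic, instead of the team maintaining the same rule in code.
for key, value in column_metadata.items():
    print(f"{key}: {value}")
```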

 

Q5. Please define the term user stories and explain how they apply to data warehousing and business intelligence.

 

User stories are critical inputs from the stakeholders of the data warehousing and business intelligence team. It is common in agile software development to give user stories importance ahead of formal requirements, because they provide inputs that are imperative to understanding the requirements and challenges of data warehousing and business intelligence systems.

If the right approach is taken and the DW/BI user stories are effectively developed, they provide a complete outlook on customer value, the simplification of the data required, data boundaries, and the techniques that can help build strong data warehousing and business intelligence systems.

When these user stories are collected, they provide concrete inputs to both developers and users within the agile framework. Since most of the requirements will have been captured in the collected user stories, it is easy for the team to identify, correlate, and provide the right inputs for developing the new system.

 

Q6. Please define the term fourth normal form (4NF) and explain how it applies to data warehousing and business intelligence.

Fourth normal form, usually written 4NF, is one of the normal forms used in database normalization. It is the next level of normalization after BCNF. The first, second and third normal forms and BCNF emphasize functional dependencies; 4NF, by contrast, focuses on a more general type of dependency, the multivalued dependency.

4NF is very important for data warehousing and business intelligence systems: when an organization has vast amounts of data to be indexed, maintained and retrieved for analysis, identifying and eliminating multivalued dependencies can be very effective. By normalizing data to 4NF, the complexity of the data systems used for analysis is greatly reduced, and companies can focus on building their data structures for optimal results.

 

In data profiling, requirements collection and systems study are essential because they familiarize the team with the data that will be loaded into the data warehouse, and 4NF can be very useful for identifying potentially independent values and multivalued dependencies.
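A small, hypothetical example may make the multivalued dependency concrete (the lecturer/course/language data is a classic textbook-style illustration, not from the source):

```python
# A lecturer's courses and languages are independent of each other, so
# storing them in one table forces every course/language combination to
# be repeated -- a multivalued dependency that violates 4NF.

lecturer_table = [
    # (lecturer, course, language)
    ("Rao", "Databases", "English"),
    ("Rao", "Databases", "Hindi"),
    ("Rao", "Statistics", "English"),
    ("Rao", "Statistics", "Hindi"),
]

# 4NF decomposition: split the two independent multivalued facts into
# separate relations, removing the redundant combinations.
lecturer_courses = sorted({(l, c) for l, c, _ in lecturer_table})
lecturer_languages = sorted({(l, g) for l, _, g in lecturer_table})

print(lecturer_courses)    # [('Rao', 'Databases'), ('Rao', 'Statistics')]
print(lecturer_languages)  # [('Rao', 'English'), ('Rao', 'Hindi')]
```

The decomposed relations store four facts in four rows instead of the original four redundant combinations, which is the complexity reduction the answer refers to.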

 

Q7.  Define the term WBS and explain its application to DW/BI.

A work breakdown structure (WBS) is one of the critical components of project management. It is a clearly defined chart that outlines the scope and boundaries of a project by classifying and dividing the project into phases, and it provides information that helps identify the resources required to complete each phase and task. Taken as a whole, it also serves as a project resource list.

Developing a data warehouse or BI application should likewise be treated as a project management scenario, since the purpose of building such a system is to obtain specific kinds of output. Regardless of the framework adopted for development, a structured approach is needed: defining the scope, the phases of implementation (at what stage which data should be integrated into the warehouse), and who should coordinate the data management.

When a WBS is integrated into the project planning of data warehousing and business intelligence systems development, it helps the organization achieve an effective outcome from implementing the data warehouse and BI systems.
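A WBS for a DW/BI project can be sketched as a nested outline (the phases and tasks below are illustrative assumptions, not a prescribed breakdown):

```python
# Hypothetical WBS for a DW/BI project, kept as a nested structure.
# Each phase decomposes into tasks; attaching owners or estimates to the
# leaves would turn this into the project resource list mentioned above.

wbs = {
    "1 Scope definition": ["1.1 Identify stakeholders", "1.2 Define data boundaries"],
    "2 Data integration": ["2.1 Source-system analysis", "2.2 ETL design", "2.3 Load scheduling"],
    "3 BI delivery": ["3.1 Report design", "3.2 User training"],
}

for phase, tasks in wbs.items():
    print(phase)
    for task in tasks:
        print("   ", task)
```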
