Understand Data Agility, our Data Science and Machine Learning service
In a data-driven economy, companies must collect, store, and process many kinds of business-generated information to unlock its hidden potential. Harnessing the power of data gives businesses an edge in a competitive market. It is a mistake to assume, however, that turning insights into effective improvements is a simple process.
Data Agility is a service that combines Data Science with Agile and Lean to help companies from all sectors to develop digital products based on data. In the words of Fernando Ultremare, Dextra’s CTO, “Data Agility introduces a data culture to organizations – where analyzed data is the basis for planning decisions and processes”.
With a data culture, it is possible to solve some of the biggest pain points companies face. For instance, service companies can better understand customer dissatisfaction in order to win and retain customers. Other possibilities commonly explored across sectors include detecting inefficiencies, optimizing resource expenditure to cut costs, and identifying fraud patterns.
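As a flavor of what "identifying fraud patterns" can mean in practice, here is a deliberately naive sketch that flags transactions far from the mean using a z-score. The data, function name, and threshold are all illustrative assumptions, not part of the actual service; real projects use richer features and learned models.

```python
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=2.0):
    """Flag values more than `threshold` standard deviations from the mean.

    A naive stand-in for fraud-pattern detection: with small samples,
    a low threshold is needed because the maximum possible z-score
    is bounded by the sample size.
    """
    mu = mean(amounts)
    sigma = stdev(amounts)
    if sigma == 0:
        return []
    return [a for a in amounts if abs(a - mu) / sigma > threshold]

# One suspiciously large transaction among ordinary ones.
transactions = [120, 95, 130, 110, 105, 9800, 115]
print(flag_anomalies(transactions))  # [9800]
```

In production this kind of rule would typically be replaced by a trained model, but the shape of the task is the same: turn raw business data into an actionable signal.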
How it works
The Data Agility process has two phases. The first is a Data Analytics Sprint, an immersion process that determines whether the results expected by the hiring company are achievable.
“At the beginning, we don’t know if the data allows us to do the work requested and achieve the result. If the answer is negative, we understand the reason and explain why. If it is positive, we have a clearer direction”, explains Everton Gago, Head of Data Science at Dextra.
Before starting the next phase, even if the project is viable, a data assessment is carried out to structure the accumulated, disorganized Big Data (the data lake) in an agile and incremental way. The assessment shows how the data is stored and what needs to be done to organize it and deliver consistent results to the business.
The second phase is the Data Sprint. The data is used in iterative sprints that solve “small pieces” of the problem, pieces that rapidly deliver tangible value to the business. These iterations continue until the expected results are met.
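The sprint loop described above can be sketched in a few lines of code. Everything here is illustrative: the "pieces", their values, and the stopping target are hypothetical stand-ins for whatever metric a real engagement would track.

```python
def run_data_sprints(pieces, target_value):
    """Process problem 'pieces' one sprint at a time until the
    accumulated delivered value reaches the target."""
    delivered = 0.0
    sprints = 0
    for piece_value in pieces:
        delivered += piece_value   # tangible value shipped this sprint
        sprints += 1
        if delivered >= target_value:
            break                  # expected results met; stop iterating
    return sprints, delivered

sprints, value = run_data_sprints([0.2, 0.3, 0.25, 0.15, 0.1],
                                  target_value=0.7)
print(sprints, value)  # 3 sprints deliver 0.75 of the target metric
```

The key property the sketch captures is that each iteration ships something tangible on its own, rather than deferring all value to a final delivery.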
Key tools include:
- Cloud computing for data storage and processing, APIs, and compute resources, which must be abundant;
- Big Data frameworks, such as Apache Spark and Hadoop, used by data scientists;
- BI tools for structuring views, such as Data Studio, Tableau, and Power BI, among others, where users access their reports;
- Automation tools based on Machine Learning, such as TensorFlow and PyTorch.
Partnerships are key to success, and CINQ works with all major players to ensure access to the required tools. CINQ’s partners include Amazon Web Services (AWS), Google, and Cloudera.
To learn more, visit our new Data Agility offering and check out all the details and processes of this service. Want to talk about a project? Just send us a message and we will get in touch with you. Contact us!