Do you know if your company’s data is clean and well managed? Why is this important anyway?

Without a working data governance plan, you may not have a business to worry about – at least not from a data perspective.

Data governance is a set of practices and processes establishing the rules, policies and procedures that ensure the accuracy, quality, reliability and security of data. It provides formal management of data assets within an organization.

Everyone in business understands the need to have and use clean data. But ensuring it’s clean and usable is a tall order, according to David Kolinek, vice president of product management at Ataccama.

The challenge is even greater when business users have to rely on limited technical resources. Often, no one oversees data governance, or the person who does doesn’t fully understand how the data will be used or how to clean it up.

That’s where Ataccama comes in. The company’s mission is to provide a solution that even people without technical knowledge, such as SQL skills, can use to find the data they need, assess its quality, understand how to solve problems and determine if this data will serve their purposes.

“With Ataccama, business users don’t need to involve IT to manage, access and cleanse their data,” Kolinek told TechNewsWorld.

Keep users in mind

Ataccama was founded in 2007 and was essentially bootstrapped.

It began as part of Adastra, a consulting company that is still in business today. But Ataccama’s focus was software rather than consulting, so management spun the operation off as a product company addressing data quality issues.

Ataccama started with a basic approach – an engine that performed data cleansing and transformation. But it still required an expert user, because the configuration had to be supplied by hand.

“So we added a visual presentation of the steps that enable data transformation and things like cleaning. This made it a low-code platform since users could do the majority of the work just by using the app’s UI. But it was still a heavy client platform,” Kolinek explained.

The current version, however, is designed for the non-technical user. The software features a light client, an emphasis on automation, and an easy-to-use interface.

“But what really stands out is the user experience, which is built on the seamless integration we were able to achieve with the 13th version of our engine. It delivers robust performance tuned to perfection,” he said.

Dig deeper into data management issues

I asked Kolinek to discuss data governance and data quality issues in more detail. Here is our conversation.

TechNewsWorld: How does Ataccama’s concept of centralizing or consolidating data management differ from other cloud systems such as Microsoft, Salesforce, AWS, and Google Cloud?

David Kolinek: We are platform independent and do not target any specific technology. Microsoft and AWS have their own native solutions that work well, but only within their own infrastructure. Our portfolio is open-ended, so we can cover the use cases that arise in any infrastructure.

Plus, we pair data processing with metadata management, which not all cloud providers offer. Metadata drives automated processing, and processing generates more metadata, which in turn can be used for further analysis.

We have developed these two technologies in-house so that we can provide native integration. As a result, we can deliver a superior user experience and lots of automation.

How does this concept differ from the notion of data standardization?

David Kolinek, Vice President of Product Management, Ataccama

Kolinek: Standardization is just one of the many things we do. Standardization can usually be automated easily, the same way we automate data cleansing or enrichment. We also support manual data correction for fixing certain issues, such as a missing Social Security number.

We cannot generate an SSN, but we could derive a date of birth from other information. So standardization is no different: it’s one subset of the things that improve quality. For us, it’s not just about standardizing data. It’s about having good-quality data so the information can be properly put to use.
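To make that distinction concrete, here is a minimal sketch of what automated standardization and rule-based enrichment can look like. The helper functions and the approximate birth-date rule are hypothetical illustrations in plain Python, not Ataccama’s actual engine or APIs:

```python
import re
from datetime import date

def standardize_phone(raw: str) -> str:
    """Standardization rule: normalize US-style phone numbers to XXX-XXX-XXXX."""
    digits = re.sub(r"\D", "", raw)
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]
    if len(digits) != 10:
        return raw  # leave unparseable values for manual correction
    return f"{digits[0:3]}-{digits[3:6]}-{digits[6:10]}"

def derive_birth_date(record: dict) -> dict:
    """Enrichment rule: derive a missing birth date from age, if present.
    (An SSN, by contrast, can never be generated -- only corrected by hand.)"""
    if not record.get("birth_date") and record.get("age"):
        year = date.today().year - int(record["age"])
        record["birth_date"] = f"{year}-01-01"  # approximate; flag for review
        record["needs_review"] = True
    return record

record = {"phone": "(555) 123 4567", "age": 41, "birth_date": None}
record["phone"] = standardize_phone(record["phone"])
print(derive_birth_date(record))
```

Note that the derived birth date is only approximate, which is why the record is flagged for review – exactly the kind of fix that still calls for a human.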

How does Ataccama’s data management platform benefit users?

Kolinek: The user experience is really our biggest advantage, and the platform is great at serving multiple types of users. Businesses need to empower both business users and IT people to manage data, which requires a solution that lets business and IT collaborate.

Another huge advantage of our platform is the strong synergy between data processing and metadata management it offers.

Most other data management vendors cover only one of these areas. We also combine machine learning with a rules-based approach to validation and standardization, which, again, other vendors often don’t support.

Additionally, because we are technology independent, users can connect to many different technologies from the same platform. With edge processing, for example, you can configure something once in Ataccama ONE, and the platform will translate it for the different target platforms.

Does the Ataccama platform lock users in like proprietary software often does?

Kolinek: We have developed all the basic components of the platform ourselves. They are tightly integrated together. There’s been a huge wave of acquisitions lately in this space, with big vendors buying up smaller ones to fill the gaps. In some cases, you are not really buying and managing one platform, but several.

With Ataccama, you can purchase a single module, such as data quality/standardization, and later expand to others, such as master data management (MDM). Everything works together seamlessly. You simply activate modules as your needs grow, which lets customers start small and scale when the time is right.

Why is a unified data platform so important in this process?

Kolinek: The biggest benefit of a unified platform is that companies don’t have to hunt for one-off solutions to individual problems, like data standardization. Everything is interconnected.

For example, to standardize, you need to validate the quality of the data, and to do that, you first need to find and catalog it. Even a problem that seems small and isolated more than likely touches many other aspects of data management, as the sketch after the next paragraph illustrates.

The beauty of a unified platform is that for most use cases you have a solution with native integration and can start using other modules.
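As a rough illustration of that interdependence, here is a toy pipeline in which a column must first be profiled (cataloged) and then validated before standardization is allowed to run. The helper functions are hypothetical stand-ins, not Ataccama ONE’s real modules:

```python
def profile(values):
    """Cataloging step: build a simple fingerprint of the column."""
    non_null = [v for v in values if v not in (None, "")]
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
    }

def validate(values, required_ratio=0.95):
    """Quality step: fail if too many values are missing."""
    stats = profile(values)
    filled = (stats["count"] - stats["nulls"]) / stats["count"]
    return filled >= required_ratio

def standardize(values):
    """Standardization step: only safe once the column has passed validation."""
    return [v.strip().upper() if isinstance(v, str) else v for v in values]

country = ["us", " De ", "fr", None, "us"]
if validate(country, required_ratio=0.7):
    print(standardize(country))  # ['US', 'DE', 'FR', None, 'US']
else:
    print("Column failed validation; fix quality issues first.")
```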

What role do AI and ML play today in data governance, data quality, and master data management? How does this change the process?

Kolinek: Machine learning allows customers to be more proactive. Previously, someone had to identify and report a problem, investigate what went wrong, and determine whether the data itself was at fault. Then you would create a data quality rule to prevent a recurrence. It was all reactive, based on something breaking, being found, reported, and then fixed.

ML, by contrast, lets you be proactive. You give it training data instead of rules. The platform then detects deviations from the learned pattern and flags anomalies, alerting you before you even realize there is a problem. That isn’t possible with a rules-based approach, and it is much easier to scale across huge numbers of data sources. The more data you have, the better the training and its accuracy.
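The contrast can be sketched in a few lines of Python. The hard-coded threshold below stands in for a hand-written data quality rule, while the learned baseline stands in for the ML approach; the numbers and the three-sigma test are illustrative assumptions, not Ataccama’s implementation:

```python
import statistics

# Daily row counts from a feed; the last load is silently broken.
daily_row_counts = [10_120, 10_340, 9_980, 10_210, 10_055, 10_400, 3_150]

# Reactive rule: written only after someone noticed a bad load
# and hard-coded a threshold for it.
RULE_MIN_ROWS = 5_000
rule_alerts = [c for c in daily_row_counts if c < RULE_MIN_ROWS]

# Learned baseline: infer the normal pattern from history and flag
# anything more than 3 standard deviations away -- no hand-written rule.
history, latest = daily_row_counts[:-1], daily_row_counts[-1]
mean, stdev = statistics.mean(history), statistics.stdev(history)
is_anomaly = abs(latest - mean) > 3 * stdev

print(f"Rule fired on: {rule_alerts}")
print(f"Learned baseline flags {latest}: {is_anomaly}")
```

The rule only catches what someone anticipated; the learned baseline would also flag a feed that quietly doubled or drifted, which no one thought to write a rule for.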

Besides cost savings, what benefits can companies derive from consolidating their data repositories? For example, does it improve security or CX outcomes?

Kolinek: It improves security and mitigates potential future leaks. For example, we had customers who stored data that no one was using. In many cases, they didn’t even know the data existed! Now they not only unify their tech stack but can also see all of their stored data.

Onboarding new people is also much easier with consolidated data. The more transparent the environment, the sooner people can use it and start deriving value.

It’s not so much about saving money as it’s about leveraging all your data to drive competitive advantage and generate additional revenue. It provides data scientists with the means to create things that will drive the business forward.

What are the steps to adopting a data management platform?

Kolinek: Start with an initial analysis. Focus on the most important problems the business wants to solve, and select the platform modules that address them. Defining objectives is essential at this stage. What KPIs do you want to target? What level of identification do you want to achieve? These are questions you need to ask yourself.

Next, you need a champion to drive execution and identify the key stakeholders who can push the initiative forward. This requires extensive communication among those stakeholders, so it’s essential that someone focuses on educating others about the benefits and helping teams get on board. Then comes the implementation phase, where you address the key issues identified in the analysis, followed by deployment.

Finally, think about the next set of issues to solve and, if necessary, activate additional platform modules to meet those goals. The worst thing you can do is buy a tool and hand it over without any services, training, or support – that all but guarantees low adoption. Education, support, and services are critical to the adoption phase.

