AI on Pi Day Highlights

We are excited to announce a new column in our Forecaster newsletter where we’ll be showcasing our sessions! You’ll find them listed in chronological order, and as a bonus, we’ll be including some complimentary gifts. To find them, just check the presentation links. Easy peasy!

Data Testing: Moving towards being proactive by Peter Flook

Why should data testing shift towards a proactive approach? By proactive, we mean anticipating and preventing issues rather than reacting to them after they occur. Often, we fall into a reactive mindset, assuming we’ve crafted flawless code, only to encounter production problems later. Reacting to these issues means waking up at odd hours and dealing with unforeseen consequences. Instead of accepting complexity as an excuse to avoid testing thoroughly, we should strive to mirror production environments as closely as possible to catch issues beforehand. This raises the question: how much testing is enough? The session addresses this alongside exploring available tools and upcoming innovations in testing that can streamline our development cycles.

Reacting to production incidents becomes a routine: we keep adding more alerts and observability tools but remain fundamentally reactive. Imagine building a bridge with patches continually added after completion. Similarly, software development becomes a cycle of firefighting unless we prioritize proactive testing. Testing not only prevents production issues but also fosters trust with users and stakeholders. It’s essential for developer experience, creating a culture of high-quality products and clear expectations.

Testing, however, comes with challenges. Complex data flows and maintenance costs often deter developers. Additionally, the intangible benefits of testing, like streamlined workflows and improved processes, are hard to quantify.

So, how do we approach testing effectively? It starts with understanding what to test, prioritizing critical components, and communicating requirements and dependencies. Performance testing is crucial too, ensuring our systems can handle expected loads. There are four main testing strategies: unit tests, integration tests, user acceptance tests (UAT), and performance tests. Each plays a role in ensuring the reliability and scalability of our systems.
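As a minimal sketch of what proactive, unit-level data testing looks like, here is an assert-based test for a small transformation. Note that `clean_orders()` and its schema are hypothetical examples for illustration, not code from the session:

```python
def clean_orders(rows):
    """Drop rows with missing IDs and normalize amounts to floats."""
    cleaned = []
    for row in rows:
        if row.get("order_id") is None:
            continue  # reject incomplete records before they reach production
        cleaned.append({"order_id": row["order_id"],
                        "amount": float(row.get("amount", 0))})
    return cleaned

def test_clean_orders_rejects_missing_ids():
    raw = [{"order_id": 1, "amount": "9.99"},
           {"order_id": None, "amount": "5.00"}]
    result = clean_orders(raw)
    assert len(result) == 1           # the incomplete row was dropped
    assert result[0]["amount"] == 9.99  # string amount was normalized

test_clean_orders_rejects_missing_ids()
print("unit test passed")
```

A test like this runs in milliseconds on every commit, which is exactly the "catch it before production" posture the talk advocates; the same rules can later be reused in integration and UAT suites.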
The tools for executing these tests vary, from HTTP API clients like Postman to data quality libraries like DQ. Docker containers are popular for integration tests. However, the future of testing lies in innovations like data contracts, automated data generation, and AI-driven test systems, which promise to streamline testing processes and improve reliability.

In conclusion, testing is not just about catching bugs; it’s about building trust, improving developer experience, and ensuring high-quality products. By embracing proactive testing and leveraging cutting-edge tools, we can create more robust and reliable software. Watch the video.
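Automated data generation, one of the innovations mentioned above, can be sketched in a few lines of standard-library Python. The schema and helper names here are illustrative assumptions, not any specific tool's API:

```python
import random
import string

def generate_order(rng):
    """Produce one synthetic row satisfying a simple order schema."""
    return {
        "order_id": rng.randint(1, 10_000),
        "customer": "".join(rng.choices(string.ascii_lowercase, k=8)),
        "amount": round(rng.uniform(0.01, 500.0), 2),
    }

def validate_order(order):
    """The kind of data-quality rule a data contract would encode."""
    return (order["order_id"] > 0
            and len(order["customer"]) > 0
            and order["amount"] > 0)

rng = random.Random(42)  # seeded so test runs are reproducible
orders = [generate_order(rng) for _ in range(100)]
assert all(validate_order(o) for o in orders)
print(f"generated and validated {len(orders)} synthetic orders")
```

Generating data from the same rules that validate it keeps test fixtures and expectations in one place, which is the appeal of pairing data contracts with automated generation.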

We work on

Bitol is a Linux Foundation AI & Data Sandbox project licensed under the Apache 2.0 license. As of now, it defines an open standard for data contracts called Open Data Contract Standard.

Share and Stay Tuned

Become a Member if you want to be a part of the story.

Share Events and News you find interesting with us here! We will give it a shout on our new newsletter AIDA Forecaster!

For exciting updates and valuable insights, follow us on LinkedIn. Stay tuned for more!
