Blogs
Because sharing is caring
Developing digital products in China with Bianca Grizhar
In this episode, we talk to Bianca Grizhar about developing digital products in China. We discuss the cultural differences between Germany and China, and how Scrum improved team collaboration. We talk about the benefits of working directly with people offshore in cross-functional, cross-organisational teams. Whether you’re a product manager or a developer, this episode is packed with practical advice that will help you work more successfully with your offshore teams.
Dysfunction Mapping with Michael Lloyd
Join Murray Robinson and Shane Gibson as they chat with Michael Lloyd about dysfunction mapping.
Whether you are new to the agile world or a seasoned professional, you’ll find valuable insights as we delve into understanding team dynamics, honing the coaching process, and applying the essential principle of iterative hypothesis testing. Join us as we learn how mapping dysfunctions can unveil powerful solutions in agile environments.
Data Storytelling with Kat Greenbrook
Explore the Art of Data Storytelling with Kat Greenbrook on the Agile Data Podcast. Dive into Kat’s transformative journey from aspiring vet to a data storytelling expert, and discover the power of the ABT (And, But, Therefore) narrative framework in conveying compelling data insights. Uncover common pitfalls and learn crucial differences between data visualization and storytelling. Enhance your business communication skills with practical tips and insights from ‘The Data Storyteller’s Handbook.’ Perfect for professionals in data analytics, business intelligence, and anyone keen to master the art of turning data into impactful stories.
Knowledge Graphs with Juan Sequeda
Dive deep into the world of knowledge graphs with Juan Sequeda on the Agile Data Podcast, hosted by Shane Gibson. Explore key insights from Juan’s journey in computer science, his pivotal role in semantic web development, and the transformative power of knowledge graphs in data integration. Discover how these technologies are reshaping the landscape of data management and the exciting future prospects with the advent of Large Language Models (LLMs). Tune in to understand the practical applications, challenges, and the future of knowledge graphs in enterprise data strategy.
Ways of Working with Scott Ambler
Join Shane Gibson on the Agile Data Podcast for an enlightening conversation with Scott Ambler, an IT and Agile expert. Delve into Scott’s journey from pioneering programmer to data architecture and Agile methodologies. Discover the evolution of Agile data, the importance of adapting ways of working, and the pitfalls of best practices. Learn valuable insights into continuous improvement, team dynamics, and the complexities of data quality in today’s fast-paced IT landscape. Don’t miss this episode for an in-depth exploration of Agile data and its impact on IT projects and processes.
Mobius loop with Gabrielle Benefield
In this episode, we talk with Gabrielle Benefield, the inventor of the Mobius loop, an outcome-focused product development framework. Gabrielle shares her journey in agile product development from Silicon Valley to Yahoo, and the importance of focusing on outcomes over outputs and improving decision speed. We discuss why organisations don’t focus on outcomes, the problem with contracts, and scaling an outcome-based approach. Finally, we talk about how you can use an experimental approach to bring an outcome-driven focus to your organisation.
Tesla and SpaceX with Joe Justice
In this episode, we talk to Joe Justice about how Tesla and SpaceX have developed a new operating model based on self-organising teams that continuously discover, deliver, and improve to achieve a thousand-year goal. It’s the most radical combination of open space agility, OKRs, continuous discovery, continuous delivery, and continuous improvement you can think of. Whatever you think of Elon Musk, these companies are way ahead of most other companies and accelerating further. Tune in for a fascinating discussion about how Tesla and SpaceX work.
Sprint Goals with Maarten Dalmijn
In this episode, Murray Robinson chats with Maarten Dalmijn about Sprint Goals. We discuss how sprint goals should act as a primary mission for a sprint, around which everyone’s tasks should revolve, rather than focusing on individual deliverables. We talk about challenges to implementing this approach, the importance of measuring outcomes over output, and the need for competency and trust in the team. This conversation highlights the importance of open dialogue with stakeholders, and the necessity of adaptability in the face of uncertainty.
The Art of Action with Stephen Bungay
Join Murray Robinson, Shane Gibson, and special guest Stephen Bungay to discuss how military principles like “Mission Command” can be applied to manage teams effectively in uncertain conditions.
AgileData App
Explore AgileData features, updates, and tips
Network
Learn about consulting practices and good patterns for data-focused consultancies
DataOps
Learn from our DataOps expertise, covering essential concepts, patterns, and tools
Data and Analytics
Unlock the power of data and analytics with expert guidance
Google Cloud
Imparting knowledge on Google Cloud's capabilities and its role in data-driven workflows
Journey
Explore real-life stories of our challenges and the lessons we learned
Product Management
Enrich your product management skills with practical patterns
What Is
Describing data and analytics concepts, terms, and technologies to enable better understanding
Resources
Valuable resources to support your growth in the agile, and data and analytics domains
AgileData Podcast
Discussing how to combine agile, product, and data patterns.
No Nonsense Agile Podcast
Discussing agile and product ways of working.
App Videos
Explore videos to better understand the AgileData App's features and capabilities.
Upgrading Python: A Plumbing Adventure in the Google Stack
In the ever-evolving world of AgileData DataOps, it was time to upgrade the Python version that powers the AgileData Platform.
We utilise micro-services patterns throughout the AgileData Platform and a bunch of Google Cloud Services. The upgrade could have gone well, or caused no end of problems.
Read more on our exciting plumbing journey.
AgileData App UX Capability Maturity Model
Reducing the complexity and effort to manage data is at the core of what we do. We love bringing magical UX to the data domain as we do this.
Every time we add a new capability or feature to the AgileData App or AgileData Platform, we ask ourselves: how could we remove the need for a Data Magician to do that task at all?
That magic is not always possible in the first, or even the third iteration of those features.
Our AgileData App UX Capability Maturity Model helps us to keep that “magic sorting hat” goal at the top of our mind, every time we add a new thing.
This post outlines what that maturity model is and how we apply it.
Unveiling the Magic of Change Data Collection Patterns: Exploring Full Snapshot, Delta, CDC, and Event-Based Approaches
Change data collection patterns are like magical lenses that allow you to track data changes. The full snapshot pattern captures complete data at specific intervals for historical analysis. The delta pattern records only changes between snapshots to save storage. CDC captures real-time changes for data integration and synchronization. The event-based pattern tracks data changes triggered by specific events. Each pattern has unique benefits and use cases. Choose the right approach based on your data needs and become a data magician who stays up-to-date with real-time data insights!
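To make the differences concrete, here is a minimal Python sketch (illustrative only, not the AgileData implementation) contrasting a delta derived by comparing two full snapshots with a CDC-style change record emitted at the moment the change happens.

```python
from datetime import datetime, timezone

# Yesterday's and today's full snapshots of a customer table, keyed by customer_id.
snapshot_t0 = {1: {"name": "Ana", "tier": "gold"}, 2: {"name": "Ben", "tier": "silver"}}
snapshot_t1 = {1: {"name": "Ana", "tier": "platinum"}, 3: {"name": "Cy", "tier": "bronze"}}

def delta(old, new):
    """Derive only the changed rows (delta pattern) by comparing two full snapshots."""
    changes = []
    for key, row in new.items():
        if key not in old:
            changes.append({"op": "insert", "key": key, "row": row})
        elif old[key] != row:
            changes.append({"op": "update", "key": key, "row": row})
    for key in old.keys() - new.keys():
        changes.append({"op": "delete", "key": key})
    return changes

# CDC and event-based patterns do not compare snapshots at all: the source emits a
# change record (with a change timestamp) as the change occurs.
cdc_record = {
    "op": "update",
    "key": 1,
    "row": {"name": "Ana", "tier": "platinum"},
    "changed_at": datetime.now(timezone.utc).isoformat(),
}

print(delta(snapshot_t0, snapshot_t1))
print(cdc_record)
```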
The challenge of parsing files from the wild
In this instalment of the AgileData DataOps series, we’re exploring how we handle the challenges of parsing files from the wild. To ensure clean and well-structured data, each file goes through several checks and processes, similar to a water treatment plant. These steps include checking for previously seen files, looking for matching schema files, queuing the file, and parsing it. If a file fails to load, we have procedures in place to retry loading or notify errors for later resolution. This rigorous data processing ensures smooth and efficient data flow.
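As a rough sketch of the idea (the function and check names here are hypothetical, not the AgileData code), the "water treatment" steps boil down to: skip files we have already seen, validate the schema, then parse, with failures parked for later resolution.

```python
import csv
import hashlib
import io

def file_fingerprint(raw: bytes) -> str:
    """Hash the raw bytes so a previously seen file can be skipped."""
    return hashlib.sha256(raw).hexdigest()

def parse_file(raw: bytes, expected_columns: list[str], seen: set[str]) -> list[dict]:
    """Illustrative checks: skip duplicates, validate the header, then parse rows."""
    fingerprint = file_fingerprint(raw)
    if fingerprint in seen:
        return []  # previously seen file: nothing to do
    reader = csv.DictReader(io.StringIO(raw.decode("utf-8")))
    if reader.fieldnames != expected_columns:
        # In a real pipeline this would be queued for retry or raised as a notification.
        raise ValueError(f"schema mismatch: {reader.fieldnames}")
    rows = list(reader)
    seen.add(fingerprint)
    return rows

seen_files: set[str] = set()
sample = b"id,name\n1,Ana\n2,Ben\n"
print(parse_file(sample, ["id", "name"], seen_files))
print(parse_file(sample, ["id", "name"], seen_files))  # second attempt is skipped
```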
The Magic of Customer Segmentation: Unlocking Personalised Experiences for Customers
Customer segmentation is the magical process of dividing your customers into distinct groups based on their characteristics, preferences, and needs. By understanding these segments, you can tailor your marketing strategies, optimize resource allocation, and maximize customer lifetime value. To unleash your customer segmentation magic, define your objectives, gather and analyze relevant data, identify key criteria, create distinct segments, profile each segment, tailor your strategies, and continuously evaluate and refine. Embrace the power of customer segmentation and create personalised experiences that enchant your customers and drive business success.
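A toy Python sketch of the "identify key criteria, create distinct segments" step might look like the following; the segment names and thresholds are purely illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Customer:
    customer_id: int
    orders_last_year: int
    total_spend: float

def segment(customer: Customer) -> str:
    """Rule-based segmentation; the criteria and cut-offs are examples, not prescriptions."""
    if customer.total_spend >= 1000 and customer.orders_last_year >= 10:
        return "loyal high value"
    if customer.orders_last_year == 0:
        return "lapsed"
    if customer.total_spend >= 1000:
        return "big spender, infrequent"
    return "developing"

customers = [Customer(1, 12, 1500.0), Customer(2, 0, 80.0), Customer(3, 3, 1200.0)]
for c in customers:
    print(c.customer_id, segment(c))
```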
Fast Answers at Your Fingertips: Unveiling AgileData’s ‘Ask a Quick Question’ Feature
Immerse yourself in the magical world of data with AgileData’s ‘Ask a Quick Question’ capability. Perfectly designed for data analysts and business analysts who need to swiftly extract insights from data, this capability facilitates quick data queries and rapid exploratory data analysis.
The Hitchhiker’s Guide to the Information Product Canvas
TL;DR: In mid 2023 I was lucky enough to present at The Knowledge Gap on the Information Product Canvas. The Information Product Canvas is an innovative pattern designed to capture data requirements visually and...
Magical plumbing for effective change dates
We discuss how to handle change data in a hands-off filedrop process. We use the ingestion timestamp as a simple proxy for the effective date of each record, allowing us to version each day’s data. For files with multiple change records, we scan all columns to identify and rank potential effective date columns. We then pass this information to an automated rule, ensuring it gets applied as we load the data. This process enables us to efficiently handle change data, track data flow, and manage multiple changes in an automated way.
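The column-scanning idea can be sketched in a few lines of Python; this is a simplified illustration (the function name and scoring are assumptions), not the automated rule itself.

```python
from datetime import datetime

def candidate_effective_date_columns(rows: list[dict]) -> list[tuple[str, float]]:
    """Score each column by the share of values that parse as dates,
    so the best candidate can be promoted to the effective-date column."""
    scores = {}
    for column in rows[0].keys():
        parsed = 0
        for row in rows:
            try:
                datetime.fromisoformat(str(row[column]))
                parsed += 1
            except ValueError:
                pass
        scores[column] = parsed / len(rows)
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

rows = [
    {"id": "1", "status": "open", "updated_at": "2024-03-01"},
    {"id": "2", "status": "closed", "updated_at": "2024-03-02"},
]
print(candidate_effective_date_columns(rows))  # updated_at ranks highest
```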
Unveiling the Secrets of Data Quality Metrics for Data Magicians: Ensuring Data Warehouse Excellence
Data quality metrics are crucial indicators in a data warehouse that measure the accuracy, completeness, consistency, timeliness, and uniqueness of data. These metrics help organisations ensure their data is reliable and fit for use, thus driving effective decision-making and analytics.
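Two of those metrics are simple enough to sketch directly; the Python below is a minimal illustration of how completeness and uniqueness can be measured over a set of rows.

```python
def completeness(rows: list[dict], column: str) -> float:
    """Share of rows where the column is populated."""
    filled = sum(1 for row in rows if row.get(column) not in (None, ""))
    return filled / len(rows)

def uniqueness(rows: list[dict], column: str) -> float:
    """Share of values in the column that are distinct."""
    values = [row.get(column) for row in rows]
    return len(set(values)) / len(values)

rows = [
    {"id": 1, "email": "ana@example.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": "ana@example.com"},
]
print(f"email completeness: {completeness(rows, 'email'):.2f}")
print(f"email uniqueness:   {uniqueness(rows, 'email'):.2f}")
```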
Amplifying Your Data’s Value with Business Context
The AgileData Context feature enhances data understanding, facilitates effective decision-making, and preserves corporate knowledge by adding essential business context to data. This feature streamlines communication, improves data governance, and ultimately, maximises the value of your data, making it a powerful asset for your business.
New Google Cloud feature to Optimise BigQuery Costs
This blog explores AgileData’s use of Google Cloud, specifically its BigQuery service, for cost-effective data handling. As a bootstrapped startup, AgileData incorporates data storage and compute costs into its SaaS subscription, protecting customers from unexpected bills. We constantly seek ways to minimise costs, utilising new Google tools for cost-saving recommendations. We argue that the efficiency and value of Google Cloud make it a preferable choice over other cloud analytic database options.
Data as a First-Class Citizen: Empowering Data Magicians
Data as a first-class citizen recognizes the value and importance of data in decision-making. It empowers data magicians by integrating data into the decision-making process, ensuring accessibility and availability, prioritising data quality and governance, and fostering a data-centric mindset.
The patterns of Data Vault with Hans Hultgren
In a compelling episode of the Agile Data Podcast, Shane Gibson invites Hans Hultgren to explore the intricacies of Data Vault modeling. Hans, with a substantial background in IT and data warehousing, brings his expertise to the table, discussing the evolution and benefits of Data Vault modeling in today’s complex data landscapes. Shane and Hans navigate through the foundations of Data Vault, including its core components like hubs, satellites, and links, and delve into more advanced concepts like Same-As Links (SALs) and Hierarchical Links (HALs). They highlight how Data Vault enables flexibility, agility, and incremental development in data modeling, ensuring scalability and adaptability to change.
To whitelabel or not to whitelabel
Are you wrestling with the concept of whitelabelling your product? We at AgileData have been there. We discuss our journey through the decision-making process, where we grappled with the thought of our painstakingly crafted product being rebranded by another company.
Metadata-Driven Data Pipelines: The Secret Behind Data Magicians’ Greatest Tricks
Metadata-driven data pipelines are the secret behind seamless data flows, empowering data magicians to create adaptable, scalable, and evolving data management systems. Leveraging metadata, these pipelines are dynamic, flexible, and automated, allowing for easy handling of changing data sources, formats, and requirements without manual intervention.
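Here is a minimal sketch of the pattern in Python (the metadata shape and transform types are invented for illustration): the pipeline's behaviour lives in declarative metadata, and a small engine interprets it, so adding a new source or transform is a configuration change rather than new code.

```python
# Pipeline behaviour declared as metadata; the engine below interprets it.
pipeline_metadata = {
    "source": "customers.csv",
    "key_columns": ["customer_id"],
    "transforms": [
        {"type": "rename", "from": "cust_nm", "to": "customer_name"},
        {"type": "uppercase", "column": "country"},
    ],
}

def apply_transform(row: dict, transform: dict) -> dict:
    if transform["type"] == "rename":
        row[transform["to"]] = row.pop(transform["from"])
    elif transform["type"] == "uppercase":
        row[transform["column"]] = row[transform["column"]].upper()
    return row

def run_pipeline(rows: list[dict], metadata: dict) -> list[dict]:
    """Interpret the metadata rather than hard-coding transformation logic."""
    for transform in metadata["transforms"]:
        rows = [apply_transform(dict(row), transform) for row in rows]
    return rows

rows = [{"customer_id": 1, "cust_nm": "Ana", "country": "nz"}]
print(run_pipeline(rows, pipeline_metadata))
```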
The Enchanting World of Data Modeling: Conceptual, Logical, and Physical Spells Unraveled
Data modeling is a crucial process that involves creating a shared understanding of data and its relationships. The three primary data model patterns are conceptual, logical, and physical. The conceptual data model provides a high-level overview of the data landscape, the logical data model delves deeper into data structures and relationships, and the physical data model translates the logical model into a database-specific schema. Understanding and effectively using these data models is essential for business analysts and data analysts to create efficient, well-organised data ecosystems.
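One way to see the three layers side by side is the small Python sketch below (the entities and BigQuery-flavoured DDL are illustrative assumptions): the conceptual level names entities and relationships only, the logical level adds attributes and keys, and the physical level commits to an engine-specific schema.

```python
from dataclasses import dataclass

# Conceptual: "A Customer places Orders" -- entities and relationships only, no attributes.

# Logical: entities gain attributes and keys, still independent of any database engine.
@dataclass
class Customer:
    customer_id: int
    name: str

@dataclass
class Order:
    order_id: int
    customer_id: int   # relationship back to Customer
    total: float

# Physical: the logical model translated into engine-specific DDL.
PHYSICAL_DDL = """
CREATE TABLE orders (
  order_id    INT64  NOT NULL,
  customer_id INT64  NOT NULL,
  total       NUMERIC
);
"""
print(PHYSICAL_DDL)
```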
Shane Gibson – Making Data Modeling Accessible
TL;DR: Early in 2023 I was lucky enough to talk to Joe Reis on the Joe Reis Show to discuss how to make data modeling more accessible, why the world's moved past traditional data modeling, and more. Listen to the episode...
AgileData Cost Comparison
AgileData reduces the cost of your data team and your data platform.
In this article we provide examples of those costs savings.
Cloud Analytics Databases: The Magical Realm for Data
Cloud Analytics Databases provide a flexible, high-performance, cost-effective, and secure solution for storing and analysing large amounts of data. These databases promote collaboration and offer various choices, such as Snowflake, Google BigQuery, Amazon Redshift, and Azure Synapse Analytics, each with its unique features and ecosystem integrations.
Data Warehouse Technology Essentials: The Magical Components Every Data Magician Needs
The key components of a successful data warehouse technology capability include data sources, data integration, data storage, metadata, data marts, data query and reporting tools, data warehouse management, and data security.
Unveiling the Definition of Data Warehouses: Looking into Bill Inmon’s Magician’s Top Hat
In a nutshell, a data warehouse, as defined by Bill Inmon, is a subject-oriented, integrated, time-variant, and non-volatile collection of data that supports decision-making processes. It helps data magicians, like business and data analysts, make better-informed decisions, save time, enhance collaboration, and improve business intelligence. To choose the right data warehouse technology, consider your data needs, budget, compatibility with existing tools, scalability, and real-world user experiences.