Right in front of the Washington Monument lies a little-known treasure concealed beneath a manhole cover: a 12-foot-tall miniature replica of the monument itself. This isn't merely an homage to one of America's Founding Fathers; it also serves a functional role in measuring the Earth. The small monolith is a geodetic control point, part of a network of approximately 1.5 million markers throughout the U.S. maintained by NOAA's National Geodetic Survey (NGS). Mapmakers, engineers, surveyors, and project managers rely on these points to determine precise locations and elevations.
When this marker was created in the 1880s, its designers chose to model it after the nearby Washington Monument. Typically, geodetic control points consist of simple metal poles embedded in concrete, topped with a bronze disc that indicates the type of mark (such as azimuth, bench, or gravity marks). Some markers also commemorate significant events—like one at the University of Utah that honors the 2002 Winter Olympics while serving as a control point. Even Walt Disney World has its own collection of geodetic markers scattered throughout its parks.
Although most people are unaware of its presence, the Washington Monument marker has been included in every NGS survey since the early 1900s and was recently used to resurvey the area following the 2011 Virginia earthquake. While GPS technology may make many of these markers seem obsolete, they continue to play an essential role, helping surveyors position GPS antennas and verify the accuracy of these modern systems.
Just as geodetic markers help us understand our physical world, data visualizations help us comprehend complex data. Both are tools for translating intricate systems into something comprehensible and actionable. However, where geodetic markers are static and precise, data visualizations are dynamic, requiring careful design to ensure accuracy and clarity. Poorly crafted visualizations can mislead, confuse, or overwhelm their audience—just as missing or inaccurate geodetic markers would disrupt navigation.
This raises a critical question: How can organizations create data visualizations that not only tell a compelling story but also ensure accuracy and comprehension for their intended audiences? It can be a challenging process, one that often leads to conflicts and disagreements within teams.
In any team discussion about data visualizations, differing interpretations are inevitable. These debates can be productive, sharpening the clarity and focus of the visualization, but they can also derail progress if left unresolved. The key to navigating these conflicts lies in fostering alignment and collaboration while keeping the ultimate goal—the audience’s understanding—front and center.
Before diving into the specific points of contention, it’s essential to clarify the purpose of the visualization. Is the goal to highlight a trend, explain a cause, or drive a decision? When everyone agrees on what the visualization is supposed to achieve, interpretations are more likely to converge. This shared sense of purpose acts as a compass, guiding the team toward a common destination. For example, if the visualization aims to illustrate year-over-year growth, any interpretation that deviates from that focus can be gently redirected.
Confusion often arises from inconsistent metrics or definitions. If one team member views "revenue" as gross income while another assumes it means net, their interpretations will naturally conflict. To address this, ensure that all data points are defined and standardized before the review process begins. Agreeing on these foundational elements beforehand eliminates unnecessary misunderstandings and keeps the discussion focused on higher-level insights rather than nitpicking over terminology.
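To make the stakes concrete, here is a minimal Python sketch (all names and figures are hypothetical) showing how the same "revenue" label produces two different numbers depending on whether the team defines it as gross or net. Agreeing on a single definition, and writing it down, removes that ambiguity before the chart is ever drawn.

```python
# Hypothetical figures for one quarter; the point is the definition, not the data.
sales = 1_200_000   # total invoiced amount
returns = 150_000   # refunds and chargebacks

# Two team members, two mental models of "revenue":
gross_revenue = sales             # everything invoiced
net_revenue = sales - returns     # after refunds

print(f"Gross revenue: ${gross_revenue:,}")
print(f"Net revenue:   ${net_revenue:,}")

# A shared data dictionary entry settles the question before the review begins.
metric_definitions = {
    "revenue": "Invoiced sales minus returns (net), reported quarterly in USD",
}
```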
Conflict resolution requires open, respectful communication. During the review, invite all team members to share their perspectives and explain their reasoning. Sometimes, what seems like a disagreement is actually a miscommunication, or it may reveal a valid insight that enhances the final product. Facilitating a collaborative dialogue fosters trust, allowing team members to feel heard and valued. A good way to structure this is to first let everyone state their views without interruption, then discuss the merits of each interpretation collectively.
When conflicts persist, refocus the discussion on the audience’s perspective. Ask, “How would our intended audience interpret this visualization?” This shifts the focus away from personal preferences or internal technicalities and toward creating a clear, compelling message for the end user. If the visualization must resonate with executives, for instance, simplicity and clarity might outweigh the inclusion of every possible detail.
In one of my own projects, a visualization review sparked heated debates about how to represent data trends. The team was split between a detailed approach, which showcased every nuance, and a more streamlined presentation that focused on the overarching message. By returning to our objectives and our audience’s needs, we crafted a solution that incorporated layered visuals—providing a high-level overview with optional drill-downs for those who wanted more detail. This compromise not only resolved the conflict but also improved the final product.
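As a rough illustration of that layered idea (not the actual project, and with made-up numbers), the matplotlib sketch below pairs a deliberately simple overview panel with a per-region detail panel for viewers who want the nuance.

```python
import matplotlib.pyplot as plt

# Hypothetical quarterly revenue (in $M), standing in for the real project data.
quarters = ["Q1", "Q2", "Q3", "Q4"]
regions = {"East": [4.1, 4.6, 5.0, 5.8],
           "West": [3.2, 3.1, 3.6, 4.0],
           "South": [2.0, 2.4, 2.3, 2.9]}
totals = [sum(vals) for vals in zip(*regions.values())]

fig, (overview, detail) = plt.subplots(2, 1, figsize=(7, 6), sharex=True)

# Layer 1: the headline message, kept deliberately simple.
overview.plot(quarters, totals, marker="o", linewidth=2, color="#1f77b4")
overview.set_title("Overview: total revenue is growing")
overview.set_ylabel("Revenue ($M)")

# Layer 2: the optional drill-down for viewers who want more detail.
for name, values in regions.items():
    detail.plot(quarters, values, marker="o", label=name)
detail.set_title("Drill-down: revenue by region")
detail.set_ylabel("Revenue ($M)")
detail.legend(frameon=False)

fig.tight_layout()
plt.show()
```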
Conflicts over data interpretations aren’t just challenges to overcome; they’re opportunities to deepen understanding, strengthen collaboration, and refine the message. When teams approach these debates with mutual respect, shared goals, and a commitment to the audience, the process becomes less about winning an argument and more about creating a visualization that truly resonates. By focusing on alignment, consistency, and audience perspective, you can turn a contentious review into a catalyst for producing your best work yet.
Having established how to navigate conflicts during the data visualization review process, we now turn our attention to the challenge of demystifying complex data visualizations. This involves striking a balance between simplicity and complexity, ensuring that the audience can easily interpret the data while still conveying the full depth of the information.
While conflicts during visualization reviews can be addressed through collaboration and alignment, it’s equally important to understand the root causes of poorly designed visualizations. According to a Harvard Business Review study, bad data visualization design often stems from three primary issues: lack of clarity, cognitive overload, and insufficient context. These issues create barriers to understanding and hinder decision-making.
Lack of Clarity: Visualizations that are overly complex or poorly labeled make it difficult for audiences to discern the key message. Clarity requires thoughtful design choices, such as selecting the right chart type, using intuitive color schemes, and ensuring labels are legible.
Cognitive Overload: When visualizations try to convey too much information at once, they overwhelm the audience. This is especially problematic when unnecessary details—such as excessive data points or decorative elements—distract from the core insight.
Insufficient Context: Data without context is just noise. Visualizations that fail to provide background information, such as trends over time or benchmarks for comparison, leave audiences guessing at the significance of the data.
Complex data visualizations often feel like a puzzle with too many pieces—overwhelming at first glance, yet rewarding once the picture comes into focus. Ensuring accurate interpretation goes beyond presenting data; it means designing visuals that guide the audience through the complexity with clarity and intention. Whether you’re creating or interpreting these visualizations, the goal is to bridge the gap between intricacy and insight.
When faced with a complex visualization, the first step is understanding its purpose. What question is the visualization trying to answer, or what story is it aiming to tell? This initial focus sets the stage for interpretation. If the purpose isn’t immediately clear, it’s a sign the visualization may need refinement. For instance, a graph comparing multiple KPIs across different regions and timeframes should make its focal point (perhaps trends over time or regional disparities) evident at a glance.
Simplification doesn’t mean dumbing down the data; it means eliminating distractions that obscure the message. Extraneous gridlines, overly elaborate 3D effects, or excessive data points can dilute clarity. A visualization packed with every possible detail might impress on a technical level, but it risks losing the audience. Instead, focus on what truly matters: the patterns, outliers, and relationships that advance the narrative. One practical approach is to use filters or layers, allowing viewers to drill down into specifics only if they choose to.
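As one way to picture that decluttering step (hypothetical data, with matplotlib as an assumed tool), the sketch below removes gridlines and chart borders, pushes the context series into the background, and lets a single highlighted series carry the message.

```python
import matplotlib.pyplot as plt

months = list(range(1, 13))
# Hypothetical series: several "context" products and one that carries the story.
context_products = {
    "Product B": [3, 3, 4, 4, 4, 5, 5, 5, 6, 6, 6, 7],
    "Product C": [5, 5, 5, 4, 4, 4, 4, 3, 3, 3, 3, 3],
    "Product D": [2, 2, 2, 3, 3, 3, 3, 3, 4, 4, 4, 4],
}
highlight_name = "Product A"
highlight_values = [2, 3, 4, 6, 7, 9, 10, 12, 13, 15, 16, 18]

fig, ax = plt.subplots(figsize=(7, 4))

# De-emphasize the context: thin grey lines, no markers, nothing competing for attention.
for name, values in context_products.items():
    ax.plot(months, values, color="lightgrey", linewidth=1)

# Emphasize the one series the narrative is about.
ax.plot(months, highlight_values, color="#d62728", linewidth=2.5, label=highlight_name)

# Strip decoration that carries no information.
ax.grid(False)
for side in ("top", "right"):
    ax.spines[side].set_visible(False)

ax.set_xlabel("Month")
ax.set_ylabel("Units sold (thousands)")
ax.set_title("Product A is pulling away from the rest of the portfolio")
ax.legend(frameon=False)
plt.show()
```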
The human brain craves order, and visual consistency makes data easier to digest. Using uniform scales, consistent color schemes, and a predictable layout fosters trust and reduces the cognitive load on your audience. For example, if one chart uses green to signify growth and another uses red for the same metric, confusion ensues. A well-thought-out color palette, applied consistently, can make even the most complex visualization feel approachable.
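One lightweight way to enforce that consistency in code (a sketch with assumed category names and invented values) is to define the palette once and reuse it everywhere, so growth is always the same green no matter which chart it appears in.

```python
import matplotlib.pyplot as plt

# A single source of truth mapping meaning to color, shared across all charts.
PALETTE = {"growth": "#2ca02c", "decline": "#d62728", "neutral": "#7f7f7f"}

quarters = ["Q1", "Q2", "Q3", "Q4"]
change = [5.0, 2.5, -1.5, 3.0]           # hypothetical percent change per quarter
index = [105.0, 107.6, 106.0, 109.2]     # hypothetical index vs. a baseline of 100

fig, (bar_ax, line_ax) = plt.subplots(1, 2, figsize=(9, 3.5))

# Chart 1: bars colored by meaning, not by position.
colors = [PALETTE["growth"] if v >= 0 else PALETTE["decline"] for v in change]
bar_ax.bar(quarters, change, color=colors)
bar_ax.set_title("Quarterly change (%)")

# Chart 2: the same semantic colors reused, so readers never relearn the legend.
line_ax.plot(quarters, index, color=PALETTE["growth"], marker="o")
line_ax.axhline(100, color=PALETTE["neutral"], linestyle="--", linewidth=1)
line_ax.set_title("Index vs. baseline (100)")

fig.tight_layout()
plt.show()
```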
Even the best-designed visualizations benefit from annotations. These small yet powerful additions (such as labels, notes, or subtle markers) can guide the audience toward critical insights without making them hunt for meaning. Highlighting a significant peak, explaining a sudden drop, or labeling key milestones adds context that raw visuals alone might not convey. Annotations transform a static chart into a guided experience, ensuring the viewer’s interpretation aligns with the intended message.
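For instance, a small matplotlib sketch of that guided experience (with invented order volumes, one labeled peak, and one explained dip) might look like this:

```python
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
orders = [120, 125, 140, 180, 175, 160, 90, 150, 165, 170, 230, 240]  # hypothetical

fig, ax = plt.subplots(figsize=(8, 4))
ax.plot(range(12), orders, marker="o", color="#1f77b4")
ax.set_xticks(range(12))
ax.set_xticklabels(months)

# Call out the peak instead of making the viewer hunt for it.
ax.annotate("Holiday campaign launch",
            xy=(10, 230), xytext=(6.5, 235),
            arrowprops=dict(arrowstyle="->", color="grey"))

# Explain the anomaly so it reads as context, not as an error in the data.
ax.annotate("Warehouse migration downtime",
            xy=(6, 90), xytext=(1.5, 95),
            arrowprops=dict(arrowstyle="->", color="grey"))

ax.set_ylabel("Orders (thousands)")
ax.set_title("Monthly orders, with key events annotated")
plt.show()
```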
Remember, what’s clear to you as the creator or analyst may not be obvious to your audience. Always step back and ask, “What assumptions am I making about my viewers’ familiarity with this data?” Tailoring the visualization to the audience’s level of expertise can mean the difference between clarity and confusion. For example, while a financial analyst might appreciate a candlestick chart's nuances, a general audience might benefit more from a line graph summarizing the same trends.
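A short Plotly sketch (with invented open-high-low-close prices) makes that trade-off concrete: the same series can be rendered as a candlestick chart for analysts or collapsed to a closing-price line for a general audience.

```python
import plotly.graph_objects as go

# Hypothetical daily prices for one week.
days = ["Mon", "Tue", "Wed", "Thu", "Fri"]
open_ = [100, 102, 101, 105, 107]
high = [103, 104, 106, 108, 110]
low = [99, 100, 100, 104, 106]
close = [102, 101, 105, 107, 109]

# For the financial analyst: full open-high-low-close detail.
analyst_view = go.Figure(go.Candlestick(x=days, open=open_, high=high, low=low, close=close))
analyst_view.update_layout(title="Analyst view: daily OHLC")

# For a general audience: the same story, reduced to the closing trend.
general_view = go.Figure(go.Scatter(x=days, y=close, mode="lines+markers"))
general_view.update_layout(title="General view: closing price trend")

analyst_view.show()
general_view.show()
```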
Complex data visualizations rarely reach perfection on the first attempt. Iteration is your ally. Share drafts with colleagues or stakeholders, solicit feedback, and be willing to revise based on their insights. Each iteration helps identify areas where clarity can be improved or where additional context is needed. This collaborative process not only enhances the visualization but also builds trust in its accuracy.
Mastering complex data visualizations is an exercise in balance—simplifying where necessary, contextualizing where helpful, and always focusing on the story the data tells. When done right, even the most intricate visualizations become powerful tools for insight and decision-making. By centering purpose, consistency, and audience understanding, you can transform overwhelming visuals into engaging narratives that resonate with your viewers and ensure accurate interpretation every time. In summary:
Start with Purpose: Define the visualization’s goal and ensure it answers a specific question or tells a clear story.
Simplify Without Oversimplifying: Eliminate distractions while retaining the depth of the data.
Leverage Technology for Interactive Visualizations: Use tools like Tableau or Power BI to create dashboards that allow users to explore data at their own pace (a code sketch of this idea follows this list).
Add Context Through Annotations: Highlight key insights with labels or notes to guide the audience.
Iterate and Refine: Share drafts, gather feedback, and improve the visualization through collaboration.
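Tableau and Power BI are natural homes for that kind of interactive exploration; for teams working in code, a hedged Plotly sketch like the one below (reusing the hypothetical regional data from the earlier layered example) offers a similar explore-at-your-own-pace experience, with a dropdown that toggles between the overview and the regional drill-down.

```python
import plotly.graph_objects as go

quarters = ["Q1", "Q2", "Q3", "Q4"]
regions = {"East": [4.1, 4.6, 5.0, 5.8],
           "West": [3.2, 3.1, 3.6, 4.0],
           "South": [2.0, 2.4, 2.3, 2.9]}
totals = [sum(vals) for vals in zip(*regions.values())]

fig = go.Figure()
# Trace 0: the overview, visible by default.
fig.add_trace(go.Scatter(x=quarters, y=totals, name="Total", visible=True))
# Traces 1..n: regional detail, hidden until the viewer asks for it.
for name, values in regions.items():
    fig.add_trace(go.Scatter(x=quarters, y=values, name=name, visible=False))

n_regions = len(regions)
fig.update_layout(
    title="Revenue overview (use the menu to drill down)",
    updatemenus=[dict(
        type="dropdown",
        buttons=[
            dict(label="Overview", method="update",
                 args=[{"visible": [True] + [False] * n_regions},
                       {"title": "Revenue overview"}]),
            dict(label="By region", method="update",
                 args=[{"visible": [False] + [True] * n_regions},
                       {"title": "Revenue by region"}]),
        ],
    )],
)
fig.show()
```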
In conclusion, both geodetic markers and data visualizations serve as tools to help us understand our world, whether it's the physical landscape or complex data sets. Just as geodetic markers (like the one hidden near the Washington Monument) provide a reliable set of data points for determining location and elevation, effective data visualizations provide a clear, compelling narrative that guides the audience through complex data.
By addressing root causes like lack of clarity, cognitive overload, and insufficient context, and adopting practical solutions such as simplification, contextual annotation, and iterative design, organizations can transform their visualizations into powerful tools for insight.
So, the next time your team faces conflict or uncertainty during the visualization design process, remember this: clarity, alignment, and purpose are the true geodetic markers of effective data storytelling. Will you create a visualization that guides your audience—or leave them lost in the data?
"when a tool is not a tool, driving decisions with data"
Dr. Joe Perez is a powerhouse in the IT and higher education worlds, with 40-plus years of experience, a wealth of credentials to his name, and appearances on multiple Times Square billboards. As a former Business Intelligence Specialist at NC State University and currently a Senior Systems Specialist/Team Leader at the NC Department of Health & Human Services (and Chief Technology Officer at CogniMind), Perez has consistently stayed at the forefront of innovation and process improvement. With more than 18,000 LinkedIn followers and a worldwide reputation as an award-winning keynote speaker, data viz/analytics expert, talk show co-host, and Amazon best-selling author, Perez is a highly sought-after resource in his field. He speaks at dozens of conferences each year, reaching audiences in over 20 countries, and has been inducted into several prestigious Thought Leader communities. When he’s not working, Dr. Joe shares his musical talents and gives back to his community through his involvement in his church’s Spanish and military ministries.