When you communicate your data science findings, don’t ditch the scientific mindset you’ve applied to every other part of your project.


It’s not at all uncommon for We All Count to work with a project that has allocated tens of thousands of dollars toward hiring data collection firms, building analysis capability, or consulting methodological experts, yet expects the product of all that work to be designed and communicated by researchers armed with only Microsoft Word and a color printer.


How you communicate your information is as important as the information itself: understanding the content depends on its effective communication. Does this mean you need to spend thousands of dollars on graphic design software, digital dashboards, and outside communication experts? No. Just maintain a scientific approach to communication.

Frame this step in the process as science, not just art. 


Too often this can feel like the moment in the process where ‘the science’ ends and how things get communicated becomes a matter of artistic considerations like ‘who’s the best writer on the team’ or an unscientific set of easy-to-follow guidelines like ‘best practice for this kind of data is a bar chart’. The first equity issue with communication is relying on a pre-established idea of what ‘good’ communication is and who is ‘good’ at it. You would never say: this science is good because so-and-so is a great scientist. You’d look at the method, test the results, repeat the experiment.


By thinking of the communication stage as something that can be measured, improved, and repeated, you’re already halfway to increasing the effectiveness and equity of your communication step.

Remove yourself from the equation. 


If you’re running a medical trial, you should not also be a subject in that trial. This is obvious scientific best practice. Likewise, you’re often too close to the subject matter to effectively communicate the information to someone who doesn’t already understand it. Of course, this presents a problem: you can hardly have your report written by someone who doesn’t understand the content.


This is where you can do some controlling of your own perspective. Go ahead and write the report, or design the infographic, or produce the animation that feels clear and impactful to you. Then consider the audience it is intended for and look at each element from their perspective. Your vocabulary, cultural perspective, methodological expertise, visual lexicon, and exposure to similar mediums are unique to you. You might not be the best judge of what ‘effective’ communication looks like in each case.


This isn’t just about improving your writing. Consider how inequitable it is to collect data from a large, vulnerable population and then, after analysis, to limit their effective access to their own information with a medium and a message that are only easy to understand if you happen to be the person who wrote them.

Design a method that fits, don’t rely on standards.  


The best data science communication is tailored to the specifics of the project. For example, most of the ‘best practices’ in data visualization don’t hold up across cultures, education levels, and other demographics. The standard bar chart is not universally understood; it typically works best for people who have been trained to read it through years of exposure. Your audience may or may not respond well to bar charts. Don’t assume.


Just like when choosing a methodology, best practices can be a helpful jumping-off point for benefiting from the improvements people have made in the past. However, it’s important to ask yourself ‘best practice for whom?’. Is this standard the best way to communicate my specific content to my specific audience? Does it need tweaking? Does it need a complete overhaul?


How can we ever know what will be effective when we’ve removed ourselves as the judge and can’t rely on someone else’s standards? This is where the real science kicks in:

Test. 


Test. Your. Communication. 


For some reason, many researchers who love to research don’t test the effectiveness of their communication. They write a draft among themselves, take a look at it, write a final version, and then distribute it untested. There are a lot of ways to test the effectiveness of your communication and see who it’s working for and who it isn’t. The most basic is to give drafts to members of your target audience. If you care that they understand it, ask them if they do. You can even test the depth, impact, and retention of your information more formally (but remember: you’re testing whether your materials work for this audience, not how good this audience is at understanding your materials).
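To make ‘more formally’ concrete, here is a minimal, hypothetical sketch in Python of tallying a short comprehension quiz by audience segment. The segments, questions, answers, and scoring scheme are all invented for illustration; the point is simply that comprehension becomes something you can measure, compare across audiences, and improve.

```python
# A minimal, hypothetical sketch of scoring a comprehension quiz by
# audience segment. All segments and answers are invented for
# illustration; swap in your real test data.
from collections import defaultdict

# Each record: (audience segment, True/False answers to short
# comprehension questions about the report or graphic).
responses = [
    ("community members", [True, True, False, False]),
    ("community members", [True, False, False, False]),
    ("funders",           [True, True, True, True]),
    ("funders",           [True, True, True, False]),
]

totals = defaultdict(lambda: [0, 0])  # segment -> [correct, asked]
for segment, answers in responses:
    totals[segment][0] += sum(answers)  # True counts as 1
    totals[segment][1] += len(answers)

for segment, (correct, asked) in totals.items():
    # A low rate means the materials aren't working for this segment,
    # not that this segment is bad at reading them.
    print(f"{segment}: {correct / asked:.0%} comprehension")
```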


One simple way that We All Count tests any data visualization is called a Reverse-Engineered Legend. We give someone in our target audience a graph and ask them to tell us what each element means in their interpretation of the visualization. We ask them to identify the meaning of element shapes, colors, lines, patterns, axes, symbols, images, size, position, direction, etc. This simple test will reveal equity issues like cultural translation errors (‘to this audience this color has a different connotation’, ‘reading time from left to right is confusing to this audience’, etc.), missed opportunities (‘there are no symbols, maybe we’re relying too much on text’, etc.), assumed knowledge (‘oh, I guess this kind of chart only makes sense to people trained to read it’), and elements that are generally unclear.
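As a sketch of how one of these sessions might be recorded, the hypothetical snippet below compares the intended meaning of each visual element with what a test reader said it means, and flags mismatches. The elements and responses are invented for illustration.

```python
# A hypothetical record of one Reverse-Engineered Legend session:
# compare what each visual element was intended to mean against what
# a test reader said it means. Elements and answers are invented.
intended = {
    "x-axis":     "time, reading left to right",
    "bar height": "number of households served",
    "red bars":   "regions below the funding target",
}

reader_said = {
    "x-axis":     "not sure -- categories, maybe?",
    "bar height": "number of households served",
    "red bars":   "danger, something bad happened",
}

for element, meaning in intended.items():
    reading = reader_said.get(element, "(no answer)")
    status = "ok" if reading == meaning else "MISMATCH"
    print(f"[{status}] {element}: intended '{meaning}' / read as '{reading}'")
```

Mismatches like the ‘red means danger’ reading above are exactly the kind of cultural translation error and assumed knowledge this test is designed to surface.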


The reverse-engineered legend is only one type of test you can run on your data communication; such tests can range from very informal to highly complex. If you find out that your communication isn’t as effective as you thought, what can you do?

Iterate. 


Once again, a common part of the scientific process gets lost in the rush to finish projects. If you’re the kind of person who would run models over and over again to perfect them, surely you can bring the same spirit to improving each version of the key product of those models.

Use the appropriate tools. 


All scientific progress walks in lockstep with technological advancement. Take advantage of tools like graphic design software, digital dashboards, interactivity, animation, and accessible databases, when appropriate, to get your message across more clearly and more completely. The difference between a default spreadsheet chart and a custom graphic can be like the difference between a spyglass and the Hubble Telescope.
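As one small illustration of that gap, here is a hypothetical Python/matplotlib sketch that draws the same invented data twice: once with defaults, and once with a plain-language title, direct value labels, and the key finding highlighted. The data, labels, and styling choices are assumptions for illustration, not a prescription; whether any particular customization helps is, again, something to test with your audience.

```python
# A hypothetical sketch: the same invented data as a default chart
# and as a lightly customized one.
import matplotlib.pyplot as plt

regions = ["North", "South", "East", "West"]
served = [120, 95, 150, 60]

fig, (default_ax, custom_ax) = plt.subplots(1, 2, figsize=(10, 4))

# Default: unlabeled bars, terse title -- the 'spreadsheet' version.
default_ax.bar(regions, served)
default_ax.set_title("served")

# Customized: plain-language title, direct value labels, and the key
# finding highlighted instead of left for the reader to decode.
bars = custom_ax.bar(regions, served, color="#888888")
bars[3].set_color("#c0392b")  # draw the eye to the outlier (West)
for bar, value in zip(bars, served):
    custom_ax.text(bar.get_x() + bar.get_width() / 2, value + 3,
                   str(value), ha="center")
custom_ax.set_title("Households served per region (West is falling behind)")
custom_ax.spines["top"].set_visible(False)
custom_ax.spines["right"].set_visible(False)

plt.tight_layout()
plt.show()
```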

Value expertise.


In the science world, it can feel like expertise is under attack. People who have dedicated a lot of time to developing mastery over any component of your project are worth consulting. They can offer techniques and solutions you couldn’t even imagine, much less execute. We think writers and graphic designers offer expertise just as valuable as that of your methodological expert or data analyst.


However, don’t fall into the equity trap of reinforcing ‘traditional’ experts’ undue power in your project and ignoring the expertise of people you might not think of as (or who might not call themselves) experts. Almost always, the people with the most expertise at refining the storytelling, cultural translation, language, and design aesthetics for your audience are members of your audience.  

Incorporate feedback. 


The scientific method depends on a loop: I do an experiment, I share my results, you critique or replicate my experiment, I do my experiment again, but better; repeat, repeat, repeat. If you are communicating your results in a medium with no mechanism for feedback, you’re cutting this cycle short. Something as simple as a comments section is a start, but you can improve equity further with open data, exploratory platforms, genuine community engagement and control, and, most importantly, by planning for the first publication of your project to be not the end of your project but one of multiple rounds of revised communication.

Show your work. 


Just as we would encourage you to be open about how you collected your data or what methodology you used, we encourage you to be transparent about why you are communicating your information the way you are. Add a paragraph about how you chose the medium, how you designed the graphics, and which audience this particular communication is aimed at. This builds trust with audiences, helps you define exactly who you are communicating with, and makes people feel better if they are reading something that isn’t intended for them. They may ask where the communication for them is, or critique what was made for them, but those are valid complaints that can help you improve the equity of your project.


Approaching communication and distribution as another important part of your data science will always boost equity. Assumptions, power structures, and laziness get revealed and corrected by using a systematic, scientific mindset. How you get your information out there is an integral part of the equitable success (or just plain success) of your project.