Your Data Science Idea is Great, But Is the Market Ready? Why Marketing is Key to Data Science Success


In the digital age, two seriously conflicting interests are growing rapidly: the drive to push technology to its limits and fill the world with innovative devices and programs, and the fear of consumers worried about the dangers of a world overrun by advanced technology.

For this very reason, John Weathington, president and CEO of Excellent Management Systems Inc., has reminded tech entrepreneurs, especially those in the data science field, to make sure the market is actually ready for their ideas before they launch them.

Some consumers fear that dystopian sci-fi movies may not be as far-fetched as they once seemed, while others hate the idea of their favorite apps, games, search engines and devices secretly keeping tabs on their every digital move.

Whatever the reason, data science leaves many consumers feeling spooked and uncomfortable.

For example, Facebook faced serious backlash after users discovered that the site had manipulated thousands of news feeds in order to collect data on user behavior for a study.

As Weathington pointed out, this practice is hardly new, but Facebook was still made out to be a monster for engaging in something much older than the website itself.

“To data scientists, it may seem perfectly normal to mine through digital behavior to understand and ultimately influence future behavior,” Weathington wrote on TechRepublic.com. “Marketing groups have been formally and publicly influencing behavior for decades, so why are Facebook’s data scientists any different?”

It’s simply a different method and a different market—that’s the key.

Even the greatest technological advance will flop if the market simply isn’t ready for, or comfortable with, it just yet.

“Innovation with data science is exciting, but it can be risky if your market isn’t ready for your next great idea,” Weathington added. “Work closely with your marketing department to understand not only if, but when your next brilliant analytic offering will be a big hit.”

Marketing specialists can conduct thorough research to determine whether the market’s current consumers would be open to your latest technological innovation. The problem is that some data scientists forget just how important the marketing team really is.

Leaving marketing specialists off the team entirely is a huge mistake that many data scientists tend to make.

Instead, some tech-savvy CEOs will simply assign the marketing tasks to their product engineers.

The problem with reassigning roles this way is that a product engineer will never be able to look objectively at something they have spent countless hours developing and view it the way an ordinary consumer would.

Data scientists should also be open to introducing a concept on a much smaller scale before gradually growing it under the watchful eye of consumers.

It’s far less threatening to watch something grow over time, something that has already become part of your daily life and proven its advantages, than to have a big, scary new way of collecting data thrust onto the market, scaring off people who are still uneasy with technology’s rapid growth.


Search Algorithms Don’t Just Know You, They’re Judging You, Too

For avid technology users, the online experience has become extremely personalized. But the same algorithms responsible for recommending new products and autocompleting search terms are not as objective as many users assume.

These algorithms can also be used to shape public opinion, reinforce racial bias and even influence voting behavior.

Tech giants like Facebook and Google routinely conduct experiments on their users to learn more about their behaviors and how those behaviors can be influenced.

Back in June, Facebook was the target of online backlash after it was revealed that the social media giant conducted a research experiment by manipulating users’ news feeds.

The study was an attempt to see whether altering those feeds could manipulate users’ emotions.

While Facebook did apologize for “any anxiety” the experiment may have caused, the test did not violate any of the social media site’s terms and conditions that users agree to before setting up their profiles.

Even more upsetting for some users was a study that attempted to see how Facebook could impact users’ willingness to vote.

The experiment proved successful: the tech giant announced that it saw a drastic increase in civic engagement and voter turnout after incorporating an “I voted” badge on certain users’ profiles.

For some people it raised the question: if social media sites can influence some users to vote, could they also influence some users not to?

Google uses the same type of testing to determine which color combinations and content placements garner the most attention from people online.
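
To make that kind of testing concrete, here is a minimal sketch of an A/B test in Python. The variant names and click rates are invented for illustration; this is not Google’s actual system, just the general pattern of splitting users into buckets and comparing click-through rates:

```python
import random
from collections import defaultdict

# Hypothetical "true" click rates for two page variants - purely
# invented numbers used to simulate user behavior.
VARIANTS = {"blue_button": 0.11, "green_button": 0.09}

def assign_variant(user_id: int) -> str:
    """Deterministically split users into two buckets by their ID."""
    return "blue_button" if user_id % 2 == 0 else "green_button"

def run_experiment(n_users: int = 10_000) -> dict:
    """Show each user their assigned variant and tally simulated clicks."""
    clicks, views = defaultdict(int), defaultdict(int)
    for user_id in range(n_users):
        variant = assign_variant(user_id)
        views[variant] += 1
        if random.random() < VARIANTS[variant]:  # simulated click
            clicks[variant] += 1
    # Compare observed click-through rates between the two variants.
    return {v: clicks[v] / views[v] for v in VARIANTS}

print(run_experiment())  # e.g. {'blue_button': 0.1108, 'green_button': 0.0893}
```

The whole point of the deterministic split is that any difference in the measured rates can be attributed to the variant itself rather than to who happened to see it.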

That ability to track behavior has also given rise to the “filter bubble”: the idea that the same search will produce very different results depending on what type of person the search engine assumes you to be.

For example, a Google search for “wagner” will likely surface sites about the composer Richard Wagner for women, while men will see results about Wagner USA, a paint supply company.
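
A toy sketch shows how this can happen. The documents, interest profiles and scoring below are all invented for illustration; real search engines use far richer signals, but the basic move, re-ranking the same results against what the engine assumes about you, is the same:

```python
# Invented result set for the query "wagner" - two plausible hits
# tagged with the topics they relate to.
RESULTS = {
    "wagner": [
        {"title": "Richard Wagner, composer", "topics": {"music", "arts"}},
        {"title": "Wagner USA, paint supplies", "topics": {"tools", "home"}},
    ]
}

def personalized_rank(query: str, assumed_interests: set) -> list:
    """Order results by overlap with the user's assumed interests."""
    hits = RESULTS.get(query, [])
    return sorted(hits,
                  key=lambda doc: len(doc["topics"] & assumed_interests),
                  reverse=True)

# Same query, two different assumed profiles -> two different orderings.
arts_profile = {"music", "arts"}
diy_profile = {"tools", "home"}
print([r["title"] for r in personalized_rank("wagner", arts_profile)])
print([r["title"] for r in personalized_rank("wagner", diy_profile)])
```

Neither user ever asked to be profiled; the engine’s guess about their interests silently decides what they see.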

Then there was the story of Latanya Sweeney, an African-American Ph.D. at Harvard University.

Sweeney realized that her Google search results were often displaying advertisements asking if she had ever been to jail.

The same advertisements weren’t appearing for her white colleagues.

After she conducted a study of the advertisements appearing alongside different people’s Google results, it turned out that the algorithms behind the ad placements were likely to draw a connection between names commonly given to Black people and ads related to arrest records.

Just like that, Sweeney was confronted with the fact that some of these so-called objective algorithms make connections based on stereotypes and racial bias.
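
The mechanism behind that kind of result can be surprisingly mundane. Below is a minimal sketch, built on a fabricated click log, of an ad selector that does nothing but chase historical click-through rates, and in doing so faithfully echoes whatever biased associations its data contains:

```python
from collections import defaultdict

# Fabricated click history: (name searched, ad shown, was it clicked?).
# The data is invented purely to illustrate the mechanism.
click_log = [
    ("latanya", "arrest_records", True),
    ("latanya", "arrest_records", True),
    ("latanya", "office_supplies", False),
    ("kristen", "office_supplies", True),
    ("kristen", "arrest_records", False),
]

def best_ad(name: str, log) -> str:
    """Pick the ad with the highest observed click-through rate for this name."""
    clicks, shows = defaultdict(int), defaultdict(int)
    for logged_name, ad, clicked in log:
        if logged_name == name:
            shows[ad] += 1
            clicks[ad] += int(clicked)
    return max(shows, key=lambda ad: clicks[ad] / shows[ad])

# The selector has no notion of fairness; it simply reproduces whatever
# associations - including discriminatory ones - the past clicks contain.
print(best_ad("latanya", click_log))  # -> 'arrest_records'
print(best_ad("kristen", click_log))  # -> 'office_supplies'
```

No one has to program the bias in; optimizing clicks over biased history is enough to reproduce it.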

The real concern comes from the fact that social media sites and search engines are not the only ones using such tools.

Earlier this year, a Hong Kong-based venture capital firm tasked an algorithm with making crucial decisions about which companies to invest in.

If such algorithms are increasingly used to make investment decisions, is it possible that the same logic that suggested Black people would want to know about arrest records could steer wealthy investors away from companies with Black CEOs or a certain percentage of Black employees?

Algorithms that merely misplace advertisements on Facebook pages do little harm, but the implications of what these systems could do on a broader scale are reason enough for marginalized groups to keep a closer eye on the decisions these automated tools are allowed to make.