Tips & Tricks

22 Data Experts Reveal the Top Techniques to Double the Effectiveness of Your Big Data Analysis Efforts

Today’s companies are generating, and making use of, data at unprecedented rates. But many companies faced with growing amounts of data aren’t making the best use of what they glean from their customers and from public data sources, whether because they lack adequate Big Data Analytics tools and techniques, are looking at the wrong data sets, or are asking the wrong questions.

Just as importantly, Big Data Analytics in today’s world means hiring the right team of data scientists, analysts, and other professionals who know their way around a data set and can carry out statistical analysis with ease. But getting the right team in place is just one facet of getting the most value from your data.

To find out what companies who want to improve Big Data Analysis should be focused on, we asked a panel of data experts, data scientists, and business intelligence professionals to answer the following question:

“What’s the #1 thing or technique companies can leverage today to double the effectiveness of their Big Data Analysis efforts?”

Find out what our experts had to say below.

Meet Our Panel of Data Experts, Data Scientists, and Business Intelligence Professionals:


Apryl DeLancey

@social_age

Apryl DeLancey is the President and CEO of Social Age Media. Based in Los Angeles, she’s not just a data scientist; she’s a data enthusiast.

“The number one thing that companies can do today to double the effectiveness of their Big Data Analysis efforts is…”

To ensure they have the right team in place. This means not only having expert programmers and statisticians, but also making sure one or more of them can gather deep insights from the data and make actionable recommendations. In other words, someone who understands not only the numbers, but also the strategic implications.


Josh Jennings

Josh Jennings is the Chief Information Officer for a hedge fund and also a co-founder and CEO of a data science focused startup company called Financial Intellect.

“The primary thing companies can do to double the effectiveness of their Big Data Analysis efforts is to…”

Engage an outside specialist. Oftentimes it takes a fresh look from an outsider to come up with innovative ways to use the data. Employees who use the data daily can become myopic and suffer from tunnel vision. The problem is that there is a shortage of qualified people, and the qualified people are usually expensive. Firms should weigh the cost-benefit of engaging a consultant and negotiate a fee based on performance.


Mikko Jarva

@ComptelCorp

Mikko Jarva is the CTO, Intelligent Data at Comptel Corporation. Based out of the company’s Kuala Lumpur office, he started his career with Comptel in 2000 as a trainer and product specialist. Since then, he has held positions in product marketing, sales, technical sales, and business development before entering his current position at the company in 2015.

“In order to improve Big Data Analysis, companies should remember…”

Apps, social media, cloud, and the sharing economy are all elevating customer experience expectations. Operators are no longer just dealing with traditional mobile data, but also have to consider connected devices, which are changing the way that businesses need to react to expectations.

Operators’ strategies therefore need to be based on granular, dynamic, in-the-moment assessments of buyers’ contextual needs. The whole Big Data paradigm needs to shift to an approach in which data is refined and analyzed simultaneously and actions are taken automatically. We are past the era of Big Data; now, it’s time for Intelligent, Fast Data.

This is real-time, enriched operational data, which can be immediately utilized for instant decision-making and action-triggering. Data has the most value in the moment it is captured, when intelligence can be immediately extracted from it. Applying capabilities to learn from and improve future action-taking based on patterns, predictions, and notifications based on this data can help operators ensure they don’t lose any valuable information and, consequently, revenue opportunities.


Michael Li

@tianhuil

Michael Li is the founder of The Data Incubator, an 8-week fellowship, to help Ph.D.s and postdocs transition from academia into industry. Previously, he headed monetization data science at Foursquare and has worked at Google, Andreessen Horowitz, J.P. Morgan, and D.E. Shaw. He is a regular contributor to VentureBeat, The Next Web, and Harvard Business Review. Michael earned his Ph.D. at Princeton and was a Marshall Scholar in Cambridge.

“The most important technique to improve Big Data Analysis is…”

Hire the right kind of data scientist. But before doing that, companies need to understand that there are actually two types of data scientists. Here’s the difference, and the kinds of backgrounds and motivations an employer can expect in each type of data scientist.

Analytics for Humans

In the case of data scientists who produce analytics for humans, another human is the final decision maker and consumer of the analysis. This type of data scientist often has to deliver a report on her findings and answer questions like what groups are using a product or what factors are driving user growth and retention.

Though they may sift through the same data sets as their analytics-for-machines counterparts, this type of data scientist delivers the results of their models and predictions to another human, who makes business or product decisions based upon these recommendations. Often, that decision maker is not a data scientist, so the data scientist must be able to explain her results in a non-technical way, which introduces an additional layer of complexity to the job. The need to explain implies that the data scientist might deliberately choose more basic models over more accurate but overly complex ones. Data scientists also must be comfortable coming to higher-level conclusions – the “why” and “how” – that are a step removed from the raw data.

A typical background for this kind of role is that of a social or medical scientist (often at the Ph.D. level). They are trained to ask the deeper questions (the “how” and “why”), making them better suited to produce analytics for humans. They are often trained to employ “simple” models and convey the results to those without deep technical understanding, like management or sales. Data scientists with these sorts of backgrounds frequently thrive on the intellectual challenge of explaining a model to another human and drawing clarity from obscure data. They also love seeing the direct impact of decision making at their organization.

Analytics for Machines

The other major category of data scientists comprises those who produce analytics for machines. In this instance, the final decision maker and consumer of the analysis is a computer. These data scientists build highly complex models that ingest vast data sets and try to extract subtle signals using machine learning and sophisticated algorithms. They tend to work in areas like algorithmic trading, online content/advertising targeting, or personalized product recommendations, to name a few. Their digital models are established and then act on their own, making recommendations, choosing ads to display, or automatically trading in the stock market.

Data scientists who produce analytics for computers must have remarkably strong mathematical, computational, and statistical skills to construct models that can make quality predictions quickly. They can piece together an array of technical tricks in order to create sophisticated models that squeeze out the last drop of performance and typically operate with easily measurable, unambiguous metrics from management such as clicks, profits, and purchases. Their value lies in leveraging their technical virtuosity over millions of situations where even small gains aggregated across millions of users and trillions of events can lead to huge wins.

Data scientists who produce analytics for machines often have mathematics, natural science, or engineering backgrounds (again, often at the Ph.D. level) with the deep computational and mathematical knowledge necessary to do the high-powered work. They also have strong software engineering backgrounds that enable them to build robust large-scale systems to deploy their analyses. They thrive on the technical challenge of building these large-scale, complex systems.

Why the Distinction Matters

It’s rare to find someone who is well-suited for both roles, so employers would do well to figure out which role they need. An MIT-trained physicist hungry for a deep machine-learning challenge likely would not be the best fit for a role in which her models must be “simple” enough for management to understand. She also may not be as comfortable extrapolating the “why” and “how” from the data. Likewise, a Harvard-trained social scientist might be great for explaining and drawing deeper conclusions from data, but may not be as well suited to produce analytics for machines. If he lacks the necessary deep mathematical and computational skills, he may not be able to build the robust systems, or may engineer simplistic models that fail to capture the data’s full value.

Understanding your data science team — what makes them tick, what drives them up the wall — is just as important to the success of a big data strategy as understanding your technology stack. It’s important to figure out what you really need from a data scientist so that you can determine which backgrounds and temperaments would be best suited to getting the job done.


Matt Stevenson

@mercer

Matt Stevenson is a Partner and Leader of Mercer’s Workforce Sciences Institute who specializes in helping organizations analyze workforce data.

“There are two key techniques companies can leverage to improve Big Data Analysis…”

1) Create a single data model, usable by analysts, that all data providers can produce; this lets data analysts avoid having to hack data together and allows for more efficient data handling and error reduction.

2) Our golden rule: don’t go and get any data unless you know precisely what decision that data will inform. Data gathering can be endless, and it takes discipline to avoid going down rabbit holes.
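Stevenson’s first point, a single data model that every provider can produce, can be sketched in a few lines. A minimal Python illustration, with all provider names, field names, and values invented for the example:

```python
# Sketch: unify records from two hypothetical data providers into one
# canonical schema so analysts never have to hack formats together.
# All field names here are illustrative, not from the article.

CANONICAL_FIELDS = ("employee_id", "hire_date", "annual_salary")

def from_provider_a(rec):
    """Provider A uses its own field names; map them to the canonical model."""
    return {
        "employee_id": rec["emp_no"],
        "hire_date": rec["start_dt"],
        "annual_salary": rec["salary"],
    }

def from_provider_b(rec):
    """Provider B reports monthly pay; convert it to the canonical annual figure."""
    return {
        "employee_id": rec["id"],
        "hire_date": rec["hired"],
        "annual_salary": rec["monthly_pay"] * 12,
    }

a = from_provider_a({"emp_no": "E1", "start_dt": "2015-03-01", "salary": 60000})
b = from_provider_b({"id": "E2", "hired": "2014-07-15", "monthly_pay": 5000})

# Every record now carries the same keys, so downstream analysis is uniform.
assert set(a) == set(b) == set(CANONICAL_FIELDS)
print(a["annual_salary"], b["annual_salary"])  # -> 60000 60000
```

The value is that downstream analysis only ever sees the canonical keys; each provider’s quirks are handled once, in its adapter.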


Sanjay Parthasarathy

@SanjayPat

Sanjay Parthasarathy is the founder and CEO of Indix. He is an engineer and artist who loves creating and building new businesses. Prior to Indix, he was at Microsoft for 19 years. His last role was as corporate VP of the Startup Business Accelerator, a division he created to focus on building startups for Microsoft.

“What companies need to do to double the effectiveness of their Big Data Analysis efforts is…”

Hire a company that can do this. Most companies do not have the time or means to sort, sift, and make sense of all the big data they’ve collected, nor the in-house expertise to do so. They need experts to do the job, and that’s where companies like Indix come in. Specific APIs can be used for on-demand searches, store and brand lookups, category exports, search suggestions, product offers, price history, and more.


Nicole Prause

@NicoleRPrause

Nicole Prause received her Ph.D. in Clinical Science with a concentration in statistics. She has worked as a statistician in academia and Data Scientist in industry for ten years and founded Liberos, LLC.

“The problem many companies have with Big Data Analysis is that…”

Businesses look almost exclusively at descriptive statistics like averages, which is a huge mistake. They make market decisions based on what appear to be lines going up or down, when those movements are really just normal variance and do not represent any stable trend. This is why analysts are not enough: businesses need someone who values error bars and can perform higher-level analyses.
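Prause’s point about variance masquerading as trend is easy to demonstrate. A small sketch, with illustrative numbers: draw a flat, noisy series (true trend of zero), fit a least-squares line, and compare the slope to its standard error before believing the “line going up”:

```python
import random
import statistics

# The "trend" in a noisy flat series is often just sampling variance.
# Draw 24 monthly values from the SAME distribution (true trend = 0),
# fit a least-squares line, and check the slope against its standard error.
random.seed(42)
months = list(range(24))
sales = [100 + random.gauss(0, 10) for _ in months]  # flat mean, noisy

n = len(months)
mx, my = statistics.mean(months), statistics.mean(sales)
sxx = sum((x - mx) ** 2 for x in months)
slope = sum((x - mx) * (y - my) for x, y in zip(months, sales)) / sxx
intercept = my - slope * mx

# Residual variance -> standard error of the fitted slope
resid = [y - (intercept + slope * x) for x, y in zip(months, sales)]
se_slope = (sum(r * r for r in resid) / (n - 2) / sxx) ** 0.5

# Rough rule of thumb: |slope| under ~2 standard errors is not evidence of a trend.
significant = abs(slope) > 2 * se_slope
print(f"slope={slope:.2f}, se={se_slope:.2f}, significant={significant}")
```

An analyst who only charts the monthly averages sees a line with some tilt; the error bars are what reveal whether the tilt means anything.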


Mike Driscoll

@medriscoll

Mike Driscoll founded Metamarkets in 2010 after spending more than a decade developing data analytics solutions for online retail, life sciences, digital media, insurance, and banking. Prior to Metamarkets, Mike successfully founded and sold two companies: Dataspora, a life science analytics company, and CustomInk, an early pioneer in customized apparel. Mike holds an A.B. in Government from Harvard and a Ph.D. in Bioinformatics from Boston University.

“The most important thing that companies today can do to increase (double or more) the return on their Big Data investments is…”

Build self-serve business intelligence tools.

Data’s value scales directly with its accessibility. Unfortunately, many data warehouses (or, to use the current term du jour, data lakes) are accessible only to a high priesthood of data scientists, analysts, and systems administrators.

Self-service business intelligence tools that make data easily and securely available, enabling anyone in an organization to click through and explore key financial or marketing metrics visually and intuitively, have tremendous ROI. The rise of self-service data analytics is driving the growth of a number of companies, from DOMO to Tableau, and the increasing emphasis firms like Salesforce are placing on self-serve BI offerings (like Wave).


Mark Kerzner

@MarkKerzner

Mark is an experienced, hands-on Big Data architect and a co-founder of Elephant Scale. He has been developing software for over 20 years and currently focuses on Hadoop, Big Data, NoSQL, and Amazon cloud services. Mark provides Hadoop training for individuals and corporations; his classes are hands-on and draw heavily on his industry experience.

“The most important thing that I would recommend to companies to increase the effectiveness of their Big Data Analysis efforts is to start with…”

Establishing feedback with their current data analysts.

When dealing with Big Data, developers tend to put efficiency and architecture first. What they then miss is how well their system answers the business analytics questions. Therefore, put simple tools like Hive (in Hadoop) or SparkSQL (in Spark) into the hands of business analysts. The analysts will know which questions they need to ask of the data. Talk to them, or record their queries, and you will have your system heading in the right direction. You will also get your business insight early (even if the queries are slow and awkward), and later you will have time to optimize performance.
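The feedback loop Kerzner describes, giving analysts a SQL interface and recording what they actually ask, can be sketched with any SQL engine. In this illustration Python’s stdlib sqlite3 stands in for Hive or SparkSQL, and the table and query are invented for the example:

```python
import sqlite3

# Sketch of the feedback loop: expose a SQL interface to analysts and log
# the queries they run. sqlite3 stands in here for Hive or SparkSQL; the
# schema and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("east", 120.0), ("east", 80.0), ("west", 50.0)])

query_log = []  # the recorded queries reveal what the system must answer fast

def run_analyst_query(sql):
    query_log.append(sql)  # capture the business question exactly as asked
    return conn.execute(sql).fetchall()

rows = run_analyst_query(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region")
print(rows)  # -> [('east', 200.0), ('west', 50.0)]
```

The log is the payoff: the slow, awkward queries analysts actually write tell you which tables, joins, and aggregations to optimize later.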


John Mount, Ph.D.

@WinVectorLLC

John Mount is the author of “Practical Data Science with R” (Manning 2014) and principal consultant at Win-Vector, LLC (a San Francisco data science consultancy).

“What companies have to do to improve Big Data Analysis is…”

Treat it as a business task that happens to require experts.

The engineering required to manage Big Data is exciting and the mathematics of the machine learning methods used on this data is fascinating, but they should not be allowed to drive your Big Data project. Treat Big Data as a business project. First accept you are going to have to collect, store, and organize data. Otherwise it is impractical to even propose new projects. Then treat each potential additional analysis project in terms of ROI, risk, and maintenance cost. Insist on measurable pilot programs of limited scale before commissioning and making full scale, data-driven business changes.


Nathan Watson

@CanWorkSmart

Nathan Watson is the President of Contemporary Analysis, which specializes in the implementation of big data and predictive analytics. Contemporary Analysis has been in the business for 8 years across multiple verticals and recently served its 300th customer.

“One of the best ways to improve Big Data Analysis is…”

Implementing predictive analytics and showing the business user how to do proactive maintenance or proactive marketing and sales. We have always found that this leads to better data collection and better buy-in, and it also prevents servers from filling up with unused data.


Matt Barney

@Leaderamp

Dr. Matt Barney is an Organizational Psychologist and founder of LeaderAmp, an artificially intelligent platform for psychometrics, coaching, and journaling. He has authored five books on topics ranging from psychometrics to Six Sigma and leadership. He has also authored four patents and recently co-authored the Adaptive Measurement and Assessment chapter in the forthcoming 2016 Annual Review of Organizational Psychology and Organizational Behavior.

“One technique commonly overlooked by data scientists is…”

Converting data into measurements. Raw data is typically not suitable for the engineering-grade measurement we would expect from thermometers and rulers. This is especially true when the data come from people in the form of rating scales or rank orders. Data may not represent the intentions of the scientist; they may be lumpy or censored. And a few rotten apples can spoil the bunch if not removed prior to use in analytic models.

The solution is to use an approach developed in the mid-20th century by the Danish mathematician Georg Rasch, called Rasch Measurement. Influenced by physical science measurement, it allows the data scientist to proactively ensure the data have a good chance of being objective, linear, and concatenatable prior to collection. After the fact, it provides quality control methods to identify surprises and deviant data points that must be removed to avoid distorted information. Importantly, Rasch is a family of methods that can even adjust for biases in ratings, such as severity/leniency in judge ratings. When data scientists apply Rasch techniques, they can achieve levels of rigor in accuracy and precision commonplace in engineering, biology, and physics. This is crucial to keep subsequent predictive models from being distorted by bad instrumentation.
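The core Rasch model itself is compact: the probability that a person of ability θ succeeds on an item of difficulty b is exp(θ − b) / (1 + exp(θ − b)). A minimal sketch of that formula, plus the kind of quality-control flag the paragraph describes, with all numbers illustrative:

```python
import math

# Minimal sketch of the Rasch model: the probability that a person with
# ability theta succeeds on an item of difficulty b is
#   P = exp(theta - b) / (1 + exp(theta - b))
# All values below are illustrative, on the usual logit scale.

def rasch_p(theta, b):
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Equal ability and difficulty gives a 50/50 chance, by construction.
assert abs(rasch_p(0.0, 0.0) - 0.5) < 1e-9

# Quality control in the Rasch spirit: flag responses the model finds
# very surprising, e.g. a low-ability respondent acing the hardest item.
theta, b, observed = -2.0, 3.0, 1      # success despite a tiny predicted P
p = rasch_p(theta, b)
surprising = observed == 1 and p < 0.05
print(f"P(success)={p:.4f}, flag_for_review={surprising}")
```

Real Rasch analysis estimates θ and b jointly from a full response matrix and uses formal fit statistics; this sketch only shows the probability model and the "surprise" idea behind those checks.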


Andrew Osborne

@clarke_inc

Andrew Osborne works as Clarke, Inc.‘s Preflight Engineer, Graphic Designer, Web Designer, and all-around computer guru. He has won several academic and design awards during his career. Print, web design, and tablet applications are his specialties and he knows how to make effective designs that call out to customers.

“One of the most important things for companies wanting to improve Big Data Analysis to remember is…”

Don’t forget about public data. Businesses often focus on collecting data from customers, but there are also all sorts of public data that you can use to help grow your business. For example, there are public reports about employment and income levels, growth of different job industries, weather patterns in your area and other aggregate data that you could use to help you target your customers more effectively. If you sell alarm systems, for example, crime reports for your area can help you target your marketing campaigns. Big Data can become overwhelming — don’t let it. Using data is just a matter of collecting and analyzing statistics that matter to your customers so that you can meet their needs better.

Excerpted from “4 Things Small Businesses Should Know in the Age of Big Data,” via Clarke, Inc.


Tahir Marfani

@MeraCRM

Tahir Marfani is an online marketer and SEO expert with MeraCRM. In India, every business has its own style, size, and needs, and in a growing economy those needs change as the business grows. MeraCRM software is built for flexibility, so it adapts to the business rather than forcing the business to align with it.

“In order to improve Big Data Analysis, companies should remember…”

Qualitative Big Data Analysis provides deep knowledge about markets, customers, and competitors, allowing companies to make fact-based and relevant decisions.
Big Data Analysis enables them to use previously stored data and evaluate real-time data, providing advanced insights.


Dev Tandon

@thekinigroup

As Founder and CEO of The Kini Group, Dev led the development of KiniMetrix, a cloud-based business analytics SaaS helping companies better identify drivers of margin variation and find sustainable margin improvement opportunities. Customers can identify critical issues and opportunities related to sales performance, price/volume/mix, customer churn, price realization, and more.

“Highly-effective data analysis combines two major components…”

1. Organizing data for fast and efficient analysis, and

2. Customizing reports and dashboards for the very specific types of analysis that vary from team to team and company to company.

Companies should therefore focus on improving the visualization of their Big Data, the ease of its drill-up and drill-down capabilities for detailed analysis, and their tools’ abilities to provide the insights they need as quickly as possible.


Christopher Penn

@cspenn

@shiftcomm

Christopher S. Penn is the Vice President of Marketing Technology at SHIFT Communications, a public relations firm, and co-host of the Marketing Over Coffee marketing podcast. He is a Google Analytics Certified Professional and a Google AdWords Certified Professional.

“One of the keys to data analysis — big or small — is…”

Understanding the purpose of analysis. Analysis is all about answering “what”. It’s from the Greek word for “loosen up”. What happened? What are the patterns in the data? Far too many companies deploy Big Data like a giant vacuum cleaner, recording every piece of information in case it might be needed, but never thinking about their analysis strategy:

  • What problem are you trying to solve?
  • What are the KPIs that measure that problem?
  • What are the diagnostic measures which lead to those KPIs?

The second major issue in Big Data Analysis is attempting to get machines to drive insights. Insights are the “why” that accompany the “what” provided by analysis. Take the floors of the Hotel Nikko in San Francisco (no affiliation), for instance. An analysis will tell you that out of the 25 floors, two numbers are missing: 4 and 13. You have the data. You have the what. But no machine is going to tell you why that’s the case (4 is considered unlucky in many East Asian cultures because it is a near-homophone of the word for death, and 13 is unlucky in Western tradition). That’s a very small example of the “why” being external to the data.


Tyler Walton

@ClutchSuccess

Tyler is Marketing Manager for Clutch, a customer engagement company that empowers consumer-focused businesses to identify, understand, and motivate their best customers with an advanced consumer management platform. Clutch delivers exceptional customer experience solutions to leading brands like New Balance, Meineke, Pandora, and Rawlings.

“The most effective way to enhance the analysis and utility of your data is to first…”

Account for it all so that it can be unified. Too many companies accept the disparate state of their customer data systems believing that it’s too arduous to connect and centralize it all. This is particularly true with customer data that spans in-store point-of-sale networks, e-commerce platforms, mobile applications, social media accounts, and other systems. Consumer management technology now allows for all of these fragmented, independent channels to be unified and synthesized. This allows the brand to identify and understand their customer behaviors and trends holistically to deliver personalized experiences and motivating engagements.


Joann Perahia

Joann Perahia is a Contractor at Systemic Solutions Inc.

“The most important thing for companies wanting to improve Big Data Analysis is to...”

Hire the right staff who understand data and know how to define it properly.

Big Data is just another new term for data analysis and statistical analysis, but you can’t analyze data properly if you don’t know what you are looking at. Corporate America has still failed at this, even though the technology has not. In a large corporation, different divisions call the same piece of data different things, so you think they are talking about something different when they are not, or vice versa, because no one communicates properly; Corporate America still makes money in spite of itself.


Holly Ferro

@5nerdssoftware

Holly Ferro is the owner of 5 Nerds Software and has extensive experience building custom analytics and projection solutions for clients looking to extract the true value out of their data.

“To improve Big Data Analysis, companies should be aware that…”

Big Data is the hot topic now. It’s a buzzword (buzzphrase?) you hear all the time, but what does it mean? And can it mean something to you?

Big Data is no different from what used to be called “data.” What’s changed is the amount of information you can extract from your data now with modern methodologies, tools, and resources. In the right hands, your data can tell you things about your organization you never even remotely suspected. As a software development firm, we have analyzed Big Data for various companies and built custom analytics and dashboard applications that let clients maintain focus on their data. A few clients come to the table completely prepared, but most do not. Why? They have a business to run! Their core competency is not dissecting and interpreting data. That’s ours. Here are some tips to help you get the most out of your big data analysis:

Know what you want
Most companies come to us and ask us to analyze their data, build metrics and models, etc. When we ask what they’re trying to determine, many have no idea. They don’t know what they’re looking for in the data. It’s okay not to know the specifics of what you want analyzed, but you should know your main objectives. Are you looking for data to determine where your most profitable customers and/or markets are? Where you’re spending money? Where you’re spending money with no return? Customer complaint levels? Employee productivity? Have at least a 30,000-foot understanding of what you’d like to look for in the data. A good firm should be able to fill in the blanks for you.

Know how to gain access to your data
This is one of the most common challenges we run into. We’ve had full discovery meetings, set clear objectives, know what we’re looking for and what we want to measure, and much more. Okay, now it’s just time to get our hands on the data and… nobody seems to know how to get us access. Your data lives in a database somewhere. It could be in your office, could be offsite, could be hosted (the “cloud”). Heck, it could be in an Excel spreadsheet. But it’s somewhere. You need to tap the right people to determine where it is and how your analysts can get access. In most cases they’re going to want a copy of the data so they can run it through analytics applications. So work on getting outside access to the database, or find a way to export all of your data into some sort of file (CSV, tab-delimited, etc.).

Allow for some creative freedom
You might know exactly what you want to look for. But you never know what an analyst will come across as they go through your data. Maybe you didn’t know that all of your customers in Oregon pay their invoices 60 days later, on average, during winter months. Did you? You see DSO (days sales outstanding) go up, but you never knew why. People who look at data regularly are tuned in to patterns and trends. Give them a little freedom and flexibility to bring you the info you didn’t know you didn’t know.

Build living solutions
Data analysis is not a one-time thing. You don’t look at your data and know what’s going to be happening a month or a year from now; you know what happened in the past, but that’s it. A true solution when looking at Big Data is to build metrics, reports, dashboards, etc. that are fed with your real-time data so that, at any time, you can see how things are looking. You can then start to spot trends and patterns in your business. Data is alive, and you need a living solution to keep up with it.

Use an editing eye
When you first get your hands on some analysis, instinct kicks in and you decide you want a report sent to you every day that outlines this, another report that shows those trends, a table showing the latest sales, a set of charts and graphs reflecting other data, an alert every time something out of the ordinary happens, and more, and more, and more. This is normal. But trust us, you will become numb to all of this information very soon. You still have a business to run. You can’t spend your entire day (or even just your entire morning) staring at statistics. If you overwhelm yourself with too much information regularly, you might as well not have any information. You need to edit. Pick two, three, four, or MAYBE five key performance indicators (KPIs) that you want to keep an eye on, and have that information sent to you regularly. And make sure it’s not five different reports that you have to scan through to get the one number you want. You need a short, concise email with all of your KPIs (and nothing more) in one place: something you can glance at for 30 seconds to get a feel for the health of the organization. Make smart decisions about what these KPIs are and you’ll rarely have to look at anything else.

If you are looking for outside help analyzing your data, it’s important to choose the right firm. All of us have various tools available for analysis. Some of us are versed in building dashboards and other analytics tools that you can use on a daily basis. But very few of us speak your language. Find a firm that you can talk to and explain your needs, your pain points, and your measures of success. You need a firm that wants to fully understand what you do, what represents success, and what represents failure before they dig into the data. There’s an old anecdote about an analyst who jumped right into the data for a professional athlete to help him determine which of his techniques and equipment were working best. After delivering a ton of information to the athlete with suggestions for how to achieve the highest scores, the athlete simply said, “I’m a golfer.” Don’t let that happen to you!


Alon Rajic

@FinofinMedia

Alon Rajic is the Managing Director of Finofin.

“The best way for a company to improve Big Data Analysis is to…”

Store data correctly from the start. That means not only making sure every bit of data is stored, but also that the keys used are unified. From my experience with data mining, the most common mistake businesses make is using a multitude of descriptive words to record data instead of preset keys and codes.
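Rajic’s preset-keys approach amounts to a controlled vocabulary that every writer must use. A minimal Python sketch, with the code table and field names invented for illustration:

```python
# Sketch: record data with preset codes instead of free-text descriptions.
# The code table is illustrative; the point is that every writer uses the
# same key, so later mining never has to reconcile spelling variants.

CHANNEL_CODES = {"WEB": "website order", "TEL": "telephone order", "POS": "in-store"}

def record_sale(channel_code, amount):
    if channel_code not in CHANNEL_CODES:
        raise ValueError(f"unknown channel code: {channel_code!r}")
    return {"channel": channel_code, "amount": amount}

sales = [record_sale("WEB", 40.0), record_sale("WEB", 10.0), record_sale("TEL", 25.0)]

# Aggregation is trivial because the keys are unified.
web_total = sum(s["amount"] for s in sales if s["channel"] == "WEB")
print(web_total)  # -> 50.0

# Free-text entry would have produced "web", "Website", "online"... and the
# sum above would silently miss records.
```

Rejecting unknown codes at write time is the cheap version of the rule: it is far easier than cleaning a multitude of descriptive spellings out of the data later.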


Marne Martin

@ServicePowerPlc

Marne is an experienced international executive leading transformation and growth for companies in the technology and telecommunication industries. She serves as CEO for ServicePower, which helps field service organizations with innovative, effective mobile workforce management solutions.

“To improve Big Data Analysis, companies must look to…”

‘Connected platforms’ that harvest data from a variety of sources across functions and devices. Data analysis must be more role-based and personalized to capture potentially useful data no matter where in the organization it comes from. Analysis is no longer confined to functional silos; take the example of the cable television sector and how it deploys people in the field for installs or maintenance. Its engineers’ activities and the data they collect have impacts across departments, including sales and marketing, operations, finance, HR, and payroll. Potentially anyone in those areas will benefit from a view of what is happening out in the field.

With the proliferation of enterprise mobility and the new age of the Internet of Things, the field for data capture is becoming much bigger. This presents businesses with an opportunity for real transformation. What if a cable TV company could predict a fault in a set-top box using analysis of historical trends, or even via an Internet of Things sensor that detects a failing part in real time? It could fix the problem before it becomes a problem for the customer. This kind of proactive customer service would provide a point of difference in a highly competitive industry and ultimately result in more satisfied customers.


Dr. Ernest Earon

@TheDataMapper

@PrecisionHawk

Ernest Earon, PhD, is Founder and CTO of PrecisionHawk, the company that created DataMapper. He has been working in the field of unmanned aerial vehicles (UAVs) and intelligent, autonomous vehicle control for over 10 years. Previously, Dr. Earon served as technical manager at the University of Toronto for UAV architecture for civil applications. Dr. Earon earned his doctorate from the University of Toronto Institute for Aerospace Studies in 2004.

“Companies can improve Big Data Analysis by…”

Reaching customers with limited to no background in data processing and analysis. Algorithm analysis tools are only useful if they are used, so it should be a top priority for Big Data companies to create an architecture that’s easy for clients. As a drone and data company focused on agriculture, we know farmers are unlikely to have backgrounds in geospatial analysis, remote sensing, and mapping. For that reason, we created the Algorithm Marketplace, an app store where drone users can upload drone imagery and select which “app” they would like for on-farm insights. We have apps for plant count, plant height, and plant health, among many others, all available with the click of a mouse. Drones are a major topic within the Internet of Things and are expected to contribute greatly as a revolutionary Big Data solution. We look forward to improving analysis tools for clients seeking actionable information across industries.