Let’s start with why.

Why do we need geospatial analysis? Why does it matter now more than ever?

The human brain indexes information largely by time, date, and location. For example, “Remember we met yesterday at the theatre?” or “Where are you now?” followed by “I am at the university”.

With the help of today’s technologies — cellphones, GPS, social media, and more — one can collect geo-referenced (time and place) data of practically any event occurring around them, or even globally.

Geospatial analysis leverages this data to build maps, cartograms, statistics, and graphs that show what is changing and where exactly. Representing data on a large scale can reveal major transformations evolving around us, such as climate change. Applications of geospatial analysis include, but are not limited to, demand-supply matching, climate change modeling, geo-marketing, better ETA estimation, and human population forecasting.

Today, companies mostly use open source tools like Kepler.gl and QGIS. While these tools are really good for certain aspects of geospatial analysis, they fall short in others. Let’s explore this in a bit of detail.


Kepler is a web-based platform to visualize large-scale location data. It was created by the visualization team at Uber with the mission to create industry-grade open source frameworks to supercharge big data. There are four major suites available (deck.gl, luma.gl, react-map-gl, and react-vis) to make beautiful data-driven maps. Kepler was built with deck.gl and utilizes WebGL (a JavaScript API) to render large data quickly and efficiently.

Kepler is merely a visualization tool. But just like robots, it is damn good at doing that particular job efficiently. It takes CSV, JSON, and GeoJSON files. The basic flow with Kepler is: you perform your operations on the data in your database (or on your local machine), download the result if it isn’t already local, and plot it in Kepler.

To make these visualizations more informative, it offers several layers you can choose from, and it’s super easy. For example, to use the point layer, you’ll have to select the latitude and longitude columns from your data and voila! And not just that, you can add multiple layers on top of one another. Isn’t that cool?
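To make this concrete, here’s a minimal sketch (plain Python, with made-up trip data and column names) of the kind of prep work that happens before you drop a file into Kepler: writing a CSV whose latitude/longitude columns the point layer can pick up directly.

```python
import csv

# Hypothetical pickup events; in practice this would come from your database.
events = [
    {"trip_id": 1, "latitude": 12.9716, "longitude": 77.5946, "fare": 120},
    {"trip_id": 2, "latitude": 12.9352, "longitude": 77.6245, "fare": 85},
    {"trip_id": 3, "latitude": 12.9279, "longitude": 77.6271, "fare": 240},
]

# Kepler auto-detects columns named "latitude"/"longitude" for the point layer,
# so keeping those names saves you a manual column selection step.
with open("trips.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["trip_id", "latitude", "longitude", "fare"])
    writer.writeheader()
    writer.writerows(events)
```

Drag `trips.csv` into Kepler and the point layer lights up; any extra columns (like `fare` here) become available for color and size encodings.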

In addition to that, you can add filters (time, date, or size), choose colors, set opacity, set outlines for the points on the map, and toggle built-in map layers such as land, road, water, and 3D buildings. It also has a compare view, which you can enable to visualize two maps side by side.


Geospatial analysis is much more than just beautiful visualizations. And as I said earlier, Kepler only offers visualization at the moment. Many features that should be there in a full-fledged geospatial analysis tool are absent. Some of them are:

  • You always have to add data files (spreadsheets or GeoJSON) manually, so you can’t make any real-time decisions.
  • Spatial operations — an important aspect of any geospatial analysis tool — like merge, join, and clustering are not available in Kepler. Not just merge and join: even simpler calculations have to be done outside and then brought back in as a CSV.
  • In Kepler, you can’t add different kinds of base maps to get more context.
  • At present, they offer their own set of colors to choose from for your visualizations, which for me, is a setback. They have a lot of options, can’t deny that, but I like to choose the colors I want in my visualization *shrug*
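To make the spatial-operations gap above concrete, here’s a rough sketch, in plain Python, of the kind of clustering you currently have to run outside Kepler and feed back in as a file. This is a naive greedy radius-based approach for illustration, not any particular library’s algorithm; the coordinates are made up.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def cluster_points(points, radius_km=1.0):
    """Greedy clustering: each point joins the first cluster centre within radius_km,
    otherwise it starts a new cluster. Returns one integer label per point."""
    centres, labels = [], []
    for lat, lon in points:
        for i, (clat, clon) in enumerate(centres):
            if haversine_km(lat, lon, clat, clon) <= radius_km:
                labels.append(i)
                break
        else:
            centres.append((lat, lon))
            labels.append(len(centres) - 1)
    return labels

# Two nearby points and one far away: the first two share a cluster.
points = [(12.9716, 77.5946), (12.9720, 77.5950), (13.0827, 80.2707)]
labels = cluster_points(points)  # → [0, 0, 1]
```

You would then write `labels` back into your CSV as an extra column and use it in Kepler to color the points, which is exactly the round trip a built-in spatial operation would save.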

But wait a minute…

The 2019 roadmap for the development of Kepler promises a lot of new features, including advanced layers, spatial operations, and customizable color scales. Catch the full list and updates here.


QGIS is a fully open-source desktop application maintained by the community. Installing and using QGIS costs zero money, but it is challenging unless you have done it before, especially if you have conflicting packages from other software. If you are familiar with programming, you can even add features yourself. It consumes almost all types of data (more than 70 vector formats). It is cross-platform and available on all three major operating systems: Windows, Mac, and Linux.

QGIS is like the Excel of the geospatial world. You can do almost anything with it, from creating maps and performing spatial operations to working with a database. If your project deals purely with GIS data, QGIS can carry it from start to finish, even at a really large scale. It also offers plugins that can add an extra star to your project; some of these are:

  1. QuickOSM — allows downloading OpenStreetMap data
  2. QuickOSM Query Engine — Helps you download specific data from QuickOSM plugin
  3. Data Plotly — Allows creating Plotly charts of vector data
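Under the hood, QuickOSM talks to the Overpass API. As a rough illustration (the exact query text QuickOSM generates may differ), here’s how you could build a similar Overpass QL query yourself in Python; the key/value/bounding box are example inputs.

```python
def overpass_query(key, value, bbox):
    """Build an Overpass QL query for OSM features tagged key=value inside bbox.

    bbox is (south, west, north, east), the coordinate order Overpass expects.
    """
    s, w, n, e = bbox
    return (
        "[out:json][timeout:25];"
        f'(node["{key}"="{value}"]({s},{w},{n},{e});'
        f'way["{key}"="{value}"]({s},{w},{n},{e});'
        f'relation["{key}"="{value}"]({s},{w},{n},{e}););'
        "out center;"
    )

# Cafes inside a bounding box roughly covering central Bengaluru.
q = overpass_query("amenity", "cafe", (12.90, 77.55, 13.00, 77.65))
# The resulting string can be POSTed to an Overpass endpoint
# such as https://overpass-api.de/api/interpreter
```

QuickOSM wraps this whole cycle (build query, fetch, load as a layer) behind a form, which is why it’s such a handy plugin.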


QGIS is by far the most enriched and useful tool for geospatial analysis. However, just like every other tool, it has its own set of limitations. And we at Locale believe it’s not just the tools: geospatial data comes with its own bucket full of challenges.

Installing and managing QGIS is challenging unless you have done it before, especially if you have conflicting packages from other software. QGIS struggles with processing large data and doesn’t really provide support for streaming data. Performing geospatial queries on streaming data becomes very compute-intensive, and we don’t have tools that support it. Developers sometimes create their own internal tools, but those are not suitable for general use, as they focus on the particular task at hand.

Moreover, it works well when connected to a database, but it is a desktop application and not on the cloud. Also, if a business person wants to use QGIS, it’s not very intuitive, as it requires spatial data knowledge. Analyses can’t be shared with anyone, and QGIS doesn’t work well at enterprise scale or with more modern methods of aggregation like geohashes or hexagonal grids. To know what these mean, please check this blog out:

Spatial Modelling Tidbits: Hexbins vs Geohashes?
Why we at Locale.ai are fond of hexagonal grids?
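For a quick taste of what geohashing does, here’s a small self-contained Python encoder (a textbook implementation for illustration, not Locale’s production code). It alternately bisects the longitude and latitude ranges, collects the resulting bits, and packs them into base-32 characters, so nearby points share a common prefix, which is what makes geohashes useful for aggregation.

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # geohash alphabet (no a, i, l, o)

def geohash_encode(lat, lon, precision=7):
    """Encode a (lat, lon) pair into a geohash string of the given length."""
    lat_range = [-90.0, 90.0]
    lon_range = [-180.0, 180.0]
    bits = []
    even = True  # geohash interleaving starts with longitude
    while len(bits) < precision * 5:
        rng, val = (lon_range, lon) if even else (lat_range, lat)
        mid = (rng[0] + rng[1]) / 2
        if val >= mid:
            bits.append(1)
            rng[0] = mid  # keep the upper half
        else:
            bits.append(0)
            rng[1] = mid  # keep the lower half
        even = not even
    chars = []
    for i in range(0, len(bits), 5):  # pack 5 bits per base-32 character
        idx = 0
        for b in bits[i:i + 5]:
            idx = idx * 2 + b
        chars.append(BASE32[idx])
    return "".join(chars)

# The well-known example point (57.64911, 10.40744) encodes to "u4pruyd"
# at precision 7; a nearby point shares the same prefix.
cell = geohash_encode(57.64911, 10.40744)  # → "u4pruyd"
```

Grouping points by a geohash prefix is a cheap way to aggregate spatial data into grid cells, which is exactly the kind of modern aggregation the section above says desktop tools handle poorly.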


So what is Locale, and why did we build it?

It all started with a personal problem. As data scientists working with geospatial data, we found the existing analytics products futile for our daily workflows. Hence, we had to build our own tools and libraries.

To know the kind of problems that come with dealing with large scale, high-frequency location data, check this out:

A Product for Operational Analytics using Geospatial Data!
What led to the birth of Locale.ai?

While doing that, we realized that it was not just us. Data scientists around the globe face similar problems when it comes to location data. As a result, businesses are struggling to attain operational efficiency. At Locale, we plan to solve this problem once and for all.

What we are trying to do is make location analysis a part of your entire analysis ecosystem. Because, of course, location intelligence is so much more than tracking and plotting points on a map. You can mix and match spatial as well as non-spatial databases together and answer all your questions. Our aim is to let business users get spatial insights without having to depend on developers.

Location intelligence is so much more than tracking and plotting points on a map.

With our platform, you can stream spatial data at scale, in real time, and at very granular levels. We have also added a drag-and-drop interface for writing complex queries, so business users (who don’t know how to write complex spatial queries) can get answers to their questions. You can easily customize your spatial models with Locale’s platform.

Read Similar:

Geospatial Clustering: Types and Use Cases
Deep dive into all the different kinds of clustering with their use cases.