BigQuery Analytics Platform

Mprove provides a data modelling layer that helps everyone in your company learn from data faster

Prepare Models for your data with BlockML files

Manage changes under version control

Learn BlockML

Create .view files for each of your data tables
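
For illustration, a minimal .view file might look like the sketch below. The table and field names are made up, and the exact BlockML keys may differ — see the BlockML docs for the real syntax:

    view: orders
    table: analytics.orders            # BigQuery table this view describes
    fields:
    - dimension: status                # a column exposed for grouping and filtering
      sql: ${TABLE}.status
    - measure: total_revenue           # an aggregate defined once in the view
      type: sum
      sql: ${TABLE}.amount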

Join views inside .model files
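
A hypothetical .model file joining two such views could look like this (the aliases and join keys are illustrative assumptions, not the definitive BlockML syntax):

    model: sales
    joins:
    - view: orders                     # base view of the model
      as: o
    - view: users                      # joined view
      as: u
      sql_on: ${o.user_id} = ${u.user_id}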

Select and filter model fields in .dashboard files
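
A .dashboard file might then select and filter fields of that model, roughly along these lines (report and filter keys are assumptions for illustration):

    dashboard: sales_overview
    reports:
    - title: Revenue by status
      model: sales
      select:
      - o.status
      - o.total_revenue
      filters:
      - field: o.status
        include: completed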

Split SQL into YAML chunks

YAML is easy to write and read.

Reuse SQL blocks

Create Views and Fields once and reuse them in different Models.

Create Models and reference them in different Reports.
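
For example, a column defined once as a dimension can be referenced by several measures instead of repeating the SQL (a sketch with made-up names):

    fields:
    - dimension: amount
      sql: ${TABLE}.amount
    - measure: total_amount            # reuses the dimension above
      type: sum
      sql: ${amount}
    - measure: average_amount          # same column, written only once
      type: average
      sql: ${amount}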

Edit files wherever you like

Connect your own Git repository and edit YAML files in your preferred IDE

Test & deploy easily with Git workflows

Edit data models in a test environment and see the changes in reports.

Publish to business users when ready.

Powerful Query Builder

Let business users Explore, Filter and Visualize data by themselves

Beautiful Dashboards

Apply the same filter to several reports at once
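
Conceptually, one dashboard-level filter can drive multiple reports, something like this hedged sketch (the listen key and all names are assumptions for illustration):

    dashboard: sales_overview
    filters:
    - filter: date_range               # defined once at the dashboard level
      field: o.created_at
    reports:
    - title: Revenue over time
      listen: [date_range]             # both reports react to the same filter
    - title: Orders over time
      listen: [date_range]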

Pricing

FREE: $0 per Project
  • All core features

BASIC: $50 per Project
  • All core features
  • Dark Theme

Prices are per month, tax excluded

FAQ

What kind of skills are needed to use BlockML?

Basic SQL skills should be enough to start. BlockML is easy to read and learn.

Do I need a Google Cloud account?

Yes. To use Mprove products, you first have to create your own Google Cloud project. You then grant Mprove access to run BigQuery queries in your project through a Service Account.

Who pays for BigQuery when using Mprove Analytics?

You pay the BigQuery costs yourself, in addition to the Mprove Analytics fees.

BigQuery vs Redshift

Costs

The minimum on-demand price for a 0.16 TB Redshift compute node is $0.25 per hour, which comes to about $180 per month ($0.25 × 24 hours × 30 days) just to store 160 GB of data. BigQuery decouples storage and compute costs: you pay $0.02 × 160 GB = $3.20 per month for storage, plus $5 per TB of data scanned by your queries.

Scale & Speed

Redshift performance is limited by the number of CPUs you are paying for. BigQuery transparently brings in as many resources as needed (up to 10,000 cores) to run your query in seconds. It’s just amazing.

Maintenance

Keeping your Redshift clusters running well requires extra work:

  • Manually resizing your cluster
  • Distributing your data across nodes so that certain queries run faster
  • Running vacuum operations

Is it worth moving data from AWS to analyze it in BigQuery?

Absolutely, as long as you don’t need to move all of that data back to AWS. Network costs for unloading data from AWS S3 to BigQuery are about $9 per 100 GB.