Create a .view file for each of your data tables.
Join Views inside Models.
Select and filter Model fields in Reports.
YAML is easy to write and read.
Create Views and Fields once and reuse them in different Models.
Create Models and reference them in different Reports.
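The layering above (Fields defined once in a View, Views joined into a Model, Models referenced by Reports) could be sketched in YAML roughly like this. This is purely illustrative: the key names below are assumptions for the sake of the example, not actual BlockML syntax.

```yaml
# orders.view — define fields once (illustrative only, not real BlockML syntax)
view: orders
fields:
  - dimension: status
    sql: ${TABLE}.status
  - measure: total_revenue
    sql: SUM(${TABLE}.amount)

# sales.model — reuse the View by joining it
model: sales
joins:
  - view: orders

# A Report would then reference the Model and pick fields from it,
# e.g. select: [orders.status, orders.total_revenue]
```

The point is the reuse: a field such as `total_revenue` is defined once in the View and becomes available to every Model that joins it, and to every Report built on those Models.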
Connect your own Git repository and edit YAML files in your preferred IDE.
Edit data models in a test environment and see the changes in reports.
Publish to business users when ready.
Pricing is billed monthly, tax excluded.
Basic SQL skills are enough to get started. BlockML is easy to read and learn.
Yes. To use Mprove products, you first have to create your own Google Cloud project. You then grant Mprove access to run BigQuery queries in your Google project through a Service Account.
You pay the BigQuery costs yourself, in addition to Mprove Analytics payments.
Minimum on-demand pricing for a 0.16 TB Redshift compute node is $0.25 per hour. That means storing 160 GB of data costs about $180 per month on Redshift. With BigQuery you can decouple storage and compute costs: you pay $0.02 × 160 GB = $3.20 per month for storage, plus $5 per TB of data scanned by your queries.
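The arithmetic above can be written out as a quick back-of-the-envelope comparison. The rates are the on-demand prices quoted above; the query volume is an assumption for illustration, and actual costs vary by region and usage.

```python
# Back-of-the-envelope monthly cost comparison, using the rates quoted above.

HOURS_PER_MONTH = 30 * 24  # ~720 hours

# Redshift: smallest on-demand compute node (0.16 TB) at $0.25/hour,
# billed for every hour the cluster is running.
redshift_monthly = 0.25 * HOURS_PER_MONTH  # $180

# BigQuery: storage and compute are decoupled.
storage_gb = 160
bigquery_storage = 0.02 * storage_gb   # $0.02 per GB per month -> $3.20
tb_scanned = 1                         # assumed monthly query volume
bigquery_queries = 5 * tb_scanned      # $5 per TB scanned

bigquery_monthly = bigquery_storage + bigquery_queries

print(f"Redshift: ${redshift_monthly:.2f}/month")
print(f"BigQuery: ${bigquery_monthly:.2f}/month "
      f"(storage ${bigquery_storage:.2f} + queries ${bigquery_queries:.2f})")
```

Even with a full terabyte scanned every month, the BigQuery bill for this dataset stays far below the fixed Redshift node cost.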
Redshift performance is limited by the number of CPUs you are paying for. BigQuery transparently brings in as many resources as needed (up to 10,000 cores) to run your query in seconds. It’s just amazing.
Keeping your Redshift clusters running well requires extra work:
Absolutely, as long as there is no need to move all of this data back to AWS. Network costs for unloading data from AWS S3 to BigQuery are about $9 per 100 GB.
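As a rough sketch of that transfer cost, using the approximate $9 per 100 GB rate quoted above (actual network pricing depends on region and transfer method):

```python
# Rough one-time cost of unloading data from AWS S3 to BigQuery
# at the ~$9 per 100 GB rate quoted above.

RATE_PER_100_GB = 9.0

def transfer_cost(gb: float) -> float:
    """Estimated network cost in dollars for moving `gb` gigabytes."""
    return RATE_PER_100_GB * gb / 100

print(f"160 GB -> ${transfer_cost(160):.2f}")  # $14.40
```

For the 160 GB dataset from the pricing example above, the one-time unload cost is on the order of $15, small compared to the monthly savings.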