Question

I uploaded a 1.8 GB CSV file to Google Cloud Storage.

When I start the training from the Google API Explorer I get an error:

{
 "error": {
  "errors": [
   {
    "domain": "global",
    "reason": "invalid",
    "message": "Data size limit exceeded."
   }
  ],
  "code": 400,
  "message": "Data size limit exceeded."
 }
}

I'm confused. From the FAQ I can read:

What training data does the Prediction API support?
Training data can be provided in one of three ways:
A CSV formatted training data file up to 2.5GB in size, loaded into Google Storage.

And from the pricing page:

Training:
$0.002/MB bulk trained (maximum size of each dataset: 250MB)


What is the difference between this 250 MB limit and the 2.5 GB limit?


Solution 2

It was a small bug in the Google Prediction API.

I posted the question on Google Group and the team fixed the bug pretty quickly: https://groups.google.com/forum/#!msg/prediction-api-discuss/Ap0WbdTco2g/kHoEMbJPteYJ

OTHER TIPS

If you are using the free tier, then:

Training: 5MB trained/day

And if you are using the paid version, then:

$0.002/MB bulk trained (maximum size of each dataset: 2.5GB)

The pricing page link you shared shows me 2.5 GB.

You seem to be using version 1.5, which has a 250 MB limit; version 1.6 allows 2.5 GB.
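To see why a 1.8 GB file fails on one version but not the other, here is a minimal pre-flight check sketch. The per-version limits (v1.5: 250 MB per dataset, v1.6: 2.5 GB) come from this thread; the function name and constants are illustrative assumptions, not part of the Prediction API itself.

```python
# Per-version dataset limits in bytes (assumption: values taken from the
# answer above -- v1.5 allows 250 MB per dataset, v1.6 allows 2.5 GB).
DATASET_LIMITS = {
    "v1.5": 250 * 1024 ** 2,        # 250 MB
    "v1.6": int(2.5 * 1024 ** 3),   # 2.5 GB
}

def fits_dataset_limit(size_bytes, api_version="v1.6"):
    """Return True if a training CSV of the given size is within the limit."""
    return size_bytes <= DATASET_LIMITS[api_version]

# A 1.8 GB file, as in the question, fails under v1.5 but passes under v1.6:
size = int(1.8 * 1024 ** 3)
print(fits_dataset_limit(size, "v1.5"))  # False -> "Data size limit exceeded."
print(fits_dataset_limit(size, "v1.6"))  # True
```

Checking the file size locally before uploading avoids waiting for the training call to reject the dataset with a 400 error.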

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow