Files quota limit exceeded

Hi, I was trying to save a modified .py file but kept getting a generic error, so I edited the file locally and tried to upload it again, but this time I get a “Files quota limit exceeded” error. I don’t have that many files (neither in number nor in size). Is this a bug?

Thank you for your help

If anyone else has this issue, there is a workaround: just use an S3 bucket that you own (more details here).
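In case it helps, here is a minimal sketch of that workaround using boto3; the bucket name and object keys below are placeholders, not anything from this thread, and it assumes your AWS credentials are already configured:

```python
import boto3

# Assumes credentials are set up via environment variables
# or ~/.aws/credentials.
s3 = boto3.client("s3")

BUCKET = "my-own-bucket"  # hypothetical bucket name, replace with yours

# Upload the edited script to your own bucket instead of Datalore storage...
s3.upload_file("script.py", BUCKET, "scripts/script.py")

# ...and pull it back down into the notebook environment when needed.
s3.download_file(BUCKET, "scripts/script.py", "script.py")
```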

My environment is pretty much stuck: it takes more than 4 minutes on any instance just to import TensorFlow, and all operations that require saving a file fail.

If I check my account, the system says that I am using 23GB and have zero available. But I was never beyond 3-4GB in total, and now I am only using 30MB…

I am trying to use an S3 bucket attached as an external source, but many operations still fail.

Hi @igro, could you have a look? I like Datalore, but at the moment it’s unusable for me. I just checked: even after my subscription was auto-renewed, the system still says I’m using 21GB, even though there are only about 30MB in my notebook.

If you help me find out what’s occupying all that storage, I’d be happy to delete it, but I can’t see it anywhere.

Hi! Sorry, I missed the initial message. Investigating.

Responded via Zendesk. Also, my colleague has updated the corresponding tickets in our tracker: there are some known improvement requests related to the Storage Usage report, aimed at making it clearer what the space is being used for.

For the record: thanks to @taglia, we found and fixed an issue in the application: in some cases, the storage quota wasn’t recalculated immediately when interactive reports with attached files were deleted.
