SharePoint storage nearing quota - how are you handling this at scale?
Posted by hakdugka@reddit | sysadmin | 13 comments
We’re running into SharePoint storage limits across multiple tenants and trying to figure out the most efficient way to handle it.
Right now, I’m using scripts to scan and analyze storage usage, but it’s extremely slow - it can take days just to process one tenant, which makes regular reporting impractical.
We already considered AvePoint, but management decided not to move forward with it.
For those managing multiple tenants (MSP setup or similar):
Are you using scripts, third-party tools, or something else?
Any best practices to avoid full tenant scans or speed things up?
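One way to avoid full tenant scans is to pull the per-site storage figures Microsoft already aggregates, via the Graph usage reports, instead of crawling each site. A rough Python sketch of that approach - the endpoint and CSV column names are from the Graph reports API, but the token acquisition is assumed to be handled elsewhere, and note that site URLs may be concealed in the report unless the admin-center privacy setting that anonymizes report data is turned off:

```python
import csv
import io

def top_sites(report_csv: str, n: int = 20):
    """Parse the Graph SharePoint site usage report CSV and return
    the n largest sites as (url, bytes_used) tuples, biggest first."""
    rows = csv.DictReader(io.StringIO(report_csv))
    sites = [(r["Site URL"], int(r["Storage Used (Byte)"])) for r in rows]
    return sorted(sites, key=lambda s: s[1], reverse=True)[:n]

# Fetching the report (assumes an app registration with Reports.Read.All
# and a bearer token obtained elsewhere, e.g. via MSAL):
#
# import requests
# url = ("https://graph.microsoft.com/v1.0/reports/"
#        "getSharePointSiteUsageDetail(period='D30')")
# resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
# print(top_sites(resp.text))
```

One report call per tenant replaces days of crawling, which matters when you repeat this across an MSP's tenant list.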
IronWoodWorking@reddit
All of the responses here are good. The new data lifecycle options for version control are great, but they don't always affect historical data, so look into version cleanup jobs for that. You need to take the retention schedule and legal holds into account to make sure the org is aligned, but for the most part it's just old versions and no data is truly being lost. I would wager that version history is 50%+ of every tenant's storage, because of how it's counted as full copies for every version.
Trim existing versions on site, library, or OneDrive - SharePoint in Microsoft 365 | Microsoft Learn
Previous-Low4715@reddit
If you buy a single Copilot for Microsoft 365 license, you can unlock SharePoint Advanced Management for your entire tenant (MS wants to fudge the uptake numbers, I guess), which includes SharePoint lifecycle management and archiving sites off to Azure storage for a fraction of the cost.
Master-IT-All@reddit
Go in and ensure that versioning is configured. It defaults to keeping up to 500 versions forever.
And then prune versions. Which does take a long time. But only needs to be done once.
Brought a customer with 2.7TB of data down to 2.1TB doing this.
SharePoint Archiving is also an option to reduce space, but currently it's site-based, so you'd need to have configured the sites into a scheme that supports archiving. File-level archiving is supposedly coming at some point.
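The pruning step above can be scripted. As a minimal sketch of the trimming logic only - the keep-count is an assumption, the version-label list shape is hypothetical, and the actual deletion would go through the SharePoint REST versions endpoint or the SPO PowerShell batch-delete jobs rather than this function:

```python
def trim_plan(file_size_bytes: int, version_labels: list, keep: int = 10):
    """Decide which versions of one file to delete, keeping the
    newest `keep`. `version_labels` is ordered oldest-to-newest.
    Returns (labels_to_delete, estimated_bytes_freed), estimating
    each stored version at roughly full file size toward quota,
    as the thread suggests SharePoint's accounting works."""
    doomed = version_labels[:-keep] if len(version_labels) > keep else []
    return list(doomed), len(doomed) * file_size_bytes
```

Running the plan in dry-run mode first (just summing the estimated savings per site) gives you the "is this worth it" number before anything is deleted.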
PorreKaj@reddit
We went from 102TB to 54TB by switching versioning from a 1,000-version limit to automatic across all sites.
https://www.reddit.com/r/sysadmin/comments/1sypzp5/i_did_the_thing_sharepoint_versioning_cleanup/
Prestigious_Rabbit30@reddit
Also, review the renewal timeframes for sites. The default is often 180 or 365 days from creation/last activity, which could mean sites sit idle for a year before being flagged for deletion (and it takes just one person doing something in the site to reset the renewal timeframe for another year).
You can set the renewal timeframe to e.g. 90 days (or 180 days max), and then owners need to manually renew sites when they come up for renewal/deletion (they will receive an email telling them to renew site XYZ before the date it will be auto-deleted, or they can delete the site from the email if it's no longer in use).
You should be able to get a report from the admin centre on the last activity date for each SharePoint site, to get an idea of how many idle/unused sites you have and whether reducing the renewal timeframe will make a difference.
purplemonkeymad@reddit
Typically I look at the sites using the most storage, then open the Storage Metrics page for that site (no need for a script, it's built in) at /_layouts/15/storman.aspx
Even works for OneDrives (as an admin).
GremlinNZ@reddit
Most clients where storage capacity was raised just elected for more storage. Cost of staff going through and cleaning up? Nah... More space. For some, it just kept going up adding 50-100GB every month or two.
That was prior to implementing HaloPSA, which can report on storage tenant-wide - natively you can't, unless you count Microsoft sending you an email to tell you you're out of space.
Each site has a default quota of 20TB+, way more than the total storage most have.
So the previous solution was a Power Automate app that reported on storage three times a week.
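A Power Automate flow isn't the only way to get that recurring report; any scheduler can run a small script that flags sites nearing their quota. A minimal sketch of the alert logic only - the input shape and the 80% threshold are my assumptions, and the used/allocated figures would come from whatever reporting source you already have:

```python
def sites_over_threshold(sites: list, threshold: float = 0.8):
    """Flag sites whose used/allocated ratio meets or exceeds the
    threshold. `sites` is a list of dicts with 'url', 'used', and
    'allocated' keys (bytes). Returns (url, ratio) tuples, worst first."""
    flagged = []
    for s in sites:
        if s["allocated"] and s["used"] / s["allocated"] >= threshold:
            flagged.append((s["url"], s["used"] / s["allocated"]))
    return sorted(flagged, key=lambda f: f[1], reverse=True)
```

Piping the flagged list into email or a ticketing system gives you the "three times a week" report without waiting for Microsoft's out-of-space warning.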
KavyaJune@reddit
For a single tenant, a script will be useful. However, if you're managing multiple tenants, a centralized tool like AdminDroid can provide better visibility.
https://admindroid.com/
You can also have a look at the PHL (Preservation Hold Library). It stores copies of content that is modified or deleted while retention policies or legal holds are in place. Also check version history: versioning is one of the most common reasons for high storage usage. You can check version history consumption and clean it up.
bit0n@reddit
Same here. AdminDroid to find the issues, then a limit put in place to fix it. Like the 200MB presentation with 99 versions. Not to mention the marketing guy who saved v1, v2, v3, etc., each of which had multiple revisions.
theoreoman@reddit
Find out if it's a handful of people just using the system inappropriately. One guy was uploading uncompressed 4K videos.
xMcRaemanx@reddit
Ideally Purview licenses with data lifecycle management policies alongside versioning limits and retention policies.
It's worth doing a quick manual review of the largest sites using the built-in reporting. We were nearing our limit and I was pretty quickly able to determine that certain shared files in use had previous versions that were massive space takers.
The_Koplin@reddit
I am not a SharePoint person, sites have quotas?
Give xyz amount and let the users manage it. Not sure what a script does in this case.
I have a few users that make use of SharePoint and one 'extreme' user that was responsible for something like 95% of all usage in the system. I got capacity warning emails. After that I set a hard limit. They managed to prioritize the data they needed. The other option they were given was to pay the cost directly for each extra gig needed, i.e. the space billed back to their budget - something like $200 per TB/month.
HankMardukasNY@reddit
https://learn.microsoft.com/en-us/microsoft-365/admin/activity-reports/sharepoint-storage-reports?view=o365-worldwide