I have used DD for a long time to figure out what I can delete from the dusty areas of my HD. Particularly when it gets close to full, it's time to get aggressive: with an SSD you don't want to get into a mode where it reads/writes the same areas over and over and cannot easily shuffle things around. DD does a fine job, but it points out stuff to delete that one becomes less and less comfortable removing, because it's unknown what that file or folder actually is.
Today I used Claude Code to review my entire hard drive and tell me where the cruft is, categorized. "Claude, review my entire SSD storage and determine what can be deleted." I included an image of the Settings/Storage screen… It then used the categories from that image and produced a result like this:
I then asked Claude to review the top 3 categories for specific delete opportunities and categorize the results like 1.1, 1.2, … 2.1, etc. It came back with something like this:
All three scans complete. Here's the full report:
---
Category 1: System Data (402.95 GB reported)
┌──────┬──────────────────────┬───────────┬────────────────────────────────────────────────────────────────┐
│ #    │ Item                 │ Size      │ Path / Notes                                                   │
├──────┼──────────────────────┼───────────┼────────────────────────────────────────────────────────────────┤
│ 1.1  │ Time Machine local   │ ~50-100+  │ 5 snapshots found (3 are OS update snapshots). Size not        │
│      │ snapshots            │ GB        │ directly measurable, but these are a major contributor to the  │
│      │                      │           │ inflated "System Data" number. Run                             │
│      │                      │           │ tmutil thinlocalsnapshots / 9999999999 to reclaim.             │
├──────┼──────────────────────┼───────────┼────────────────────────────────────────────────────────────────┤
│ 1.2  │ Autodesk data        │ 19 GB     │ ~/Library/Application Support/Autodesk/                        │
└──────┴──────────────────────┴───────────┴────────────────────────────────────────────────────────────────┘
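The snapshot cleanup in row 1.1 can be scripted. A minimal sketch, defaulting to a dry run that only prints the commands, since `tmutil` is macOS-only and the purge amount is given in bytes:

```shell
#!/bin/sh
# Sketch of the snapshot cleanup from row 1.1. DRY_RUN=1 (the default)
# only prints each command; set DRY_RUN=0 on a Mac to actually run them.
run() {
  echo "+ $*"                      # show the command first
  [ "${DRY_RUN:-1}" = "1" ] || "$@"
}

run tmutil listlocalsnapshots /               # see which snapshots exist
run tmutil thinlocalsnapshots / 9999999999 4  # purge amount in bytes, urgency 1-4
```

`tmutil thinlocalsnapshots` takes a mount point, a purge amount in bytes, and an optional urgency level; the 9999999999 figure is simply the one Claude suggested in the table.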
This is so much better than DD because it gives me much greater insight into what/why I can/should delete. Though I think DD has a good-looking interface, it is ultimately not as clear as this list, sorted by category and delete-size opportunity.
I'll play with this a bit longer, but DD is going on the retirement list for me. Hope this is useful to anyone…
I will second Grand Perspective (even better if you change the colour scheme).
I always found the doughnut presentation of DaisyDisk difficult to intuit. The nested block approach of GP is far easier for my brain. Especially given you can rapidly move your mouse pointer over the graph to get a sense of not only large files, but also directories with lots and lots of small files.
Yeah, similar to DD, it too shows file/folder paths, but in many cases that alone isn't enough to know whether something can or should be deleted. Claude does much better, see below, and this output is with minimal instruction on my side. It can easily output more detail if necessary.
Personally, I'm usually only looking for my own files I left lying around. Would you always trust Claude to tell you what can be deleted? We all know these things can be spectacularly wrong on some things. Especially when it comes to using historical information for contemporary problems.
Developing in various/many environments is a messy business. In addition, CAD, FEA and such tools are rather grabby as well, and it is not so simple to see where they leave stuff around. Yes, one's own files and folders are easy to track; you don't need any tool for that. But what they stuck in Application Support, the Library com.… stuff? Who knows where that is.
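The inventory step for that kind of clutter is easy to script. A hedged sketch (the path is just one example of a grabby corner; Caches and Containers are others) of the size-sorted list I then hand to the LLM to explain:

```shell
#!/bin/sh
# List the 20 biggest items under ~/Library/Application Support,
# largest first, with sizes converted from KB to MB.
du -sk "$HOME/Library/Application Support"/* 2>/dev/null \
  | sort -rn \
  | head -20 \
  | awk '{ size = $1 / 1024; $1 = ""; sub(/^ +/, ""); printf "%8.1f MB  %s\n", size, $0 }'
```

The output is plain text, so it can be pasted straight into a prompt along the lines of "explain what each of these is and whether it is safe to delete".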
No, nor was I suggesting that Claude would do the deleting for me.
Fairly obvious from how I constructed the prompts and output, I thought? I let it make a list of opportunities from which I pick. With its explanation I have a better idea of what I am about to do. When in doubt I can ask it to research what the (dang) file/folder is for.
But it's OK, we all have our workflows we are attached to…
Time Machine local snapshots are automatically deleted by macOS as they age (typically after 24 hours) or when disk space is needed for other files. They are designed to be temporary, stored on your internal drive, and do not prevent you from using that space for apps or files. There is no need or benefit to manually delete them.
At least on my Mac viewing using Disk Inventory X (basically a completely free but aged Grand Perspective), iOS files turned out to be backups of my iPhone made to my Mac.
That depends on how you define "need or benefit."
My experience with Apple's disk management is that they'll happily run the disk to just shy of completely full before attempting to recover space. In fact, I've gotten macOS "disk full" errors when there were plenty of Apple-managed purgeable files that could have been removed.
When you consider that running an SSD beyond 80-90% full is bad for the disk, there would seem to be instances where a sufficiently knowledgeable user might want to clean out stuff that Apple otherwise wouldn't.
Put differently, Apple has a defined, non-configurable set of priorities that a reasonable user may disagree with.
A quick close-out, since my title clearly triggered the tool-defense reflex a bit more than I expected (in hindsight, a less snappy title would have been the better choice).
Disk tools like DaisyDisk were always meant to help free space by showing where the big stuff lives. I used it for years for exactly that. My experience was that the visualization was nice, but the real work still started after that: figuring out what those folders actually were, whether deleting them was safe, and sometimes how they should be removed properly. I generally know quite a bit of the file system but certainly not everything and god knows I have deleted a folder or two I had to get back later on from the backup.
What I tried above was letting an LLM look at the inventory of large paths and explain them: what they're typically used for, whether they're caches, backups, build artifacts, etc., and whether deletion is usually safe. I still make the decisions and execute everything manually. The LLM isn't deleting anything; it's just collapsing the "figure out what this folder is" research step. (Those Time Machine snapshots, btw, were NOT automatically deleted by the OS; checking the dates, they were several years old, though they should have been.)
For me this list turned out to be far more useful than the treemap DD creates, because it answers the question I actually care about: not just where the space is, but what the data is and whether I should keep it. The meaning the LLM provides compresses the research step I'd otherwise need to do with web searches.
If others prefer the visual tools, totally fine. I just found the reasoning layer surprisingly more helpful than the visualization, which is why I posted about it. Power users and all that, no?
I use gdu-go from the CLI, and it's super handy. You can install it with brew.
It doesn't have any AI, but honestly, I wouldn't trust any AI to automatically delete stuff on my computer anyway.
It's relatively quick to scan, and you can easily set exclusions using the -i flag (I usually exclude CloudStorage).
Basically, it does what DD does.
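For reference, a sketch of that invocation; the exclusion path is an example, and on macOS Homebrew links the binary as `gdu-go` to avoid clashing with coreutils' `gdu`, so the script checks for both names. It prints the command rather than launching the interactive TUI, so you can review it first:

```shell
#!/bin/sh
# Print the gdu invocation with an -i (--ignore-dirs) exclusion instead of
# launching the interactive TUI directly. The CloudStorage path is an example.
GDU_BIN=$(command -v gdu-go || command -v gdu || true)
if [ -n "$GDU_BIN" ]; then
  echo "run: $GDU_BIN -i \"$HOME/Library/CloudStorage\" \"$HOME\""
else
  echo "gdu not found; install it with: brew install gdu"
fi
```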