Mac Pro vs Studio

I need to replace my 2019 Intel Mac Pro, which I use to review huge PDF documents across multiple monitors.

By far the prevailing wisdom on MPU and in all the tech blogs is that the Mac Studio makes sense for all but the most extreme Mac users, since the newest Mac Pro costs $3,000+ more and supposedly only those doing major video editing, nuclear physics, etc. need it.

But in looking into this - the speed of an external Thunderbolt SSD is about 2,600 MB/s, but the speed of an internal PCIe Gen 4 SSD is about 26,000 MB/s. That's a 10-fold difference - plus no noisy fan needed for an external Thunderbolt enclosure.

In an era where terabyte-size main SSDs are common for the Mac, I think it's relevant to ask how to back up such a drive. It takes quite a while for me to back up a large DEVONthink database, or especially my entire 8TB SSD. With a 26,000 MB/s internal SSD I could easily do a daily clone of the main drive.
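A rough back-of-the-envelope check on those figures (assuming the advertised sequential rates hold end to end, which real copies rarely do):

```python
# Full-clone time for an 8 TB drive at the two advertised sequential
# speeds (MB/s = megabytes per second; decimal units, 1 TB = 1,000,000 MB).
DRIVE_MB = 8 * 1_000_000

for label, mb_per_s in [("Thunderbolt SSD", 2_600), ("internal PCIe Gen 4", 26_000)]:
    minutes = DRIVE_MB / mb_per_s / 60
    print(f"{label}: {minutes:.0f} minutes")

# Thunderbolt SSD: 51 minutes
# internal PCIe Gen 4: 5 minutes
```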

While I agree that one probably needs a business use to justify the expense, I don't think this is as exotic a use case as almost all of the tech blogs make it out to be.

But I am open to hearing other thoughts if I am mistaken on the performance improvement or otherwise not analyzing this realistically.

1 Like

How much of that 8TB drive changes every day for you? Backup software should only be transferring the changes, not the whole drive each time, even for clone-type backups.

5 Likes
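If you want an actual number for that question, here's a quick, unoptimized sketch that sums the size of everything modified in the last 24 hours under a given path (the path is a placeholder - point it at your data folder):

```python
import os
import time

ROOT = "/Volumes/Data"          # placeholder; the folder to measure
CUTOFF = time.time() - 86_400   # anything modified in the last 24 hours

changed_bytes = 0
for dirpath, _dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        try:
            st = os.stat(os.path.join(dirpath, name))
        except OSError:
            continue  # skip files that vanish or are unreadable mid-scan
        if st.st_mtime >= CUTOFF:
            changed_bytes += st.st_size

print(f"~{changed_bytes / 1e9:.1f} GB modified in the last day")
```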

Not only that, but even a full backup of 8TB of data at 2,600 MB/s will take less than an hour. If the backup is scheduled for the middle of the night, the time taken will never be noticed.

5 Likes

Agreed; you’ve got to have a workflow that involves regular rapid data transfer to justify the 2023 Mac Pro, I think. Not just a backup or occasional loading of an in-memory database.

Some kind of external PCIe 4.0 x16 port on a future Studio would be really nice for single larger drives.

That’s a sweet SSD!

You may find this article useful in helping decide whether or not the extra cost of the Pro is justifiable for your use case.

A few GB change daily.

But those changes are often within several DEVONthink databases, which are each a few hundred GB in size. I don't think even incremental backup software can isolate just the changes within a file - especially something like a DEVONthink database, which is really a folder rather than a distinct file.

But more importantly - while I do have a NAS and Arq for incremental backups - I have never felt totally comfortable relying on backups via that technology alone, because recovering from a complete failure means I have to rely on all of the past incremental backups being valid. I once had a restoration failure with Time Machine where file corruption somewhere in the process meant that I could not restore the drive.

Currently I use a hybrid approach: incremental backups of my entire drive, plus daily backups of the most critical parts of my data. It works. But I suspect that with a 26,000 MB/s internal SSD I could pretty easily do an entire-drive clone on a daily basis.

My thought is that if I use CCC to make the internal PCIe SSD a bootable clone, then if the main drive is ever damaged I can boot from the backup SSD and be back in business with no more than a day's data lost - and I can get that lost data back from my NAS. Unless I am wrong in my interpretation of the performance, this seems to me to be a really nice setup for redundancy. And I think this is a use case applicable to many people who use a computer as an essential work tool with lots of regularly changing documents. I don't think it's a use case just for video editors and nuclear scientists.

Thank you @liminal

That’s one of the articles I read - it is typical of many, with this comment:

Technically, the PCIe slots do make it more versatile and internally upgradable than the Mac Studio. It’s a tiny victory, though.

While lots of tech authors say that, I must really question whether an increase from 2,600 MB/s to 26,000 MB/s in backup speed is a “tiny victory.”

Perhaps the initial reviews came out before OWC had released the specs on the 8M2, so the distinction between the new Mac Pro's PCIe capability and Thunderbolt was not as clear? And then everyone has repeated that without considering the capability of new-generation internal SSDs?

It takes much longer than that. I think you may be confusing megabits with megabytes, which is a difference of a factor of at least 8 - and maybe a factor of 9 if there is an extra parity bit.

1 Like

DEVONthink is actually many files (thousands, millions, or more, in their original formats) inside a folder structure that DEVONthink creates and understands, presented as a macOS “package” which looks like one file. All the files probably don't change all the time - just a few, which probably suits incremental backup. A hunch. Best to back up DEVONthink when DEVONthink is not in use. I close it and allow backups at night. When/if I recover, I'll look for a night-time backup.
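That hunch matches how file-level incremental tools behave in principle: a macOS package is just a directory underneath, so anything that walks it sees the individual files and can copy only the changed ones. A minimal sketch of the idea - the paths and marker file are hypothetical, and this is not how CCC or Arq are actually implemented:

```python
import os
import shutil

SRC = "/Users/me/Databases/Research.dtBase2"   # hypothetical DEVONthink package
DEST = "/Volumes/Backup/Research.dtBase2"
MARKER = "/Volumes/Backup/.last_backup"        # touched after each successful run
last_backup = os.path.getmtime(MARKER)

# The "package" is a plain directory, so only files whose mtime is newer
# than the last run need to be copied - not the few-hundred-GB database.
for dirpath, _dirnames, filenames in os.walk(SRC):
    for name in filenames:
        src_path = os.path.join(dirpath, name)
        if os.path.getmtime(src_path) > last_backup:
            dest_path = os.path.join(DEST, os.path.relpath(src_path, SRC))
            os.makedirs(os.path.dirname(dest_path), exist_ok=True)
            shutil.copy2(src_path, dest_path)  # copy just the changed file
```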

I think your units are off. First, you shouldn't be using capital Bs in the post if you meant bits. But your Thunderbolt enclosure should be writing in the ballpark of 1.5 gigabytes per second right now, so a full 8TB backup should take ~90 minutes or less, like @tomalmy said.

1 Like
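For anyone keeping score on the units, the conversions in question:

```python
# Capital B = bytes, lowercase b = bits; 8 bits = 1 byte, so quoting the
# same link in bits makes the number look 8x larger.
rate_MB_per_s = 2_600                     # megabytes per second
print(rate_MB_per_s * 8, "Mb/s")          # 20,800 megabits per second

# And the ~90 minute figure: 8 TB (8,000 GB) at a real-world ~1.5 GB/s.
print(f"{8_000 / 1.5 / 60:.0f} minutes")  # ≈ 89
```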

You may have a valid use case for purchasing a Mac Pro. From what I've read, the only reason to purchase a Mac Pro these days is PCIe storage. It tops out at 192GB of RAM, which isn't enough for many professionals. It can't match the processing power of the newest Intel and AMD chips, and you can't add third-party graphics cards. Since Apple has apparently abandoned the very high end, some wonder if today's Mac Pro may be the last one produced.

Businesses have always relied on incremental backups. Very few have so little data that they could ever do a complete backup on a daily basis. At my last job I probably could not have done a complete backup without shutting down the business for a week or more. My backup window was 11:00 PM to 6:00 AM.

I relied on all my past incremental backups being valid by testing them on a regular basis. I still do that with my personal data. I check my Arq logs every day, and every few weeks I restore a few files and open them to ensure they are good. IMO this is crucial regardless of the backup method used.

Ultimately, it’s your money and your decision. :+1:

4 Likes
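Restore testing like that can be partly automated. A sketch that compares checksums of restored files against the live originals - the file pairs are placeholders, and it assumes the originals haven't changed since the backup ran:

```python
import hashlib

def sha256(path: str) -> str:
    """Stream a file through SHA-256 in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

PAIRS = [  # (original, restored) - hypothetical sample files
    ("/Users/me/Documents/contract.pdf", "/tmp/restore/contract.pdf"),
    ("/Users/me/Documents/notes.md", "/tmp/restore/notes.md"),
]

for original, restored in PAIRS:
    ok = sha256(original) == sha256(restored)
    print("OK  " if ok else "FAIL", restored)
```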

Perhaps my units are incorrect - I agree there is much confusion in tech specs between megabytes and megabits, and it is easy to make that mistake.

That said - this is a log of current incremental backups from my 4M2 on my 2019 Mac Pro, which is advertised as “up to 2800 MB/s”. Whatever the units, this performance is similar to what is advertised for the new Mac Pro over Thunderbolt. The 8M2 internal SSD is advertised as about 10 times faster.

It would take several hours to transfer 8TB on the current SSD. That's not easy to do, as I shut down DEVONthink during the backup, and there are not that many hours per day when DEVONthink is not being accessed by someone.

Something else is going on there. That’s only around 50 megabytes per second. It could be because it’s copying tons of small files, or another issue.

Totally agreed that it's hard to keep track of all the units involved - and we aren't even invoking MT/s or GT/s yet.

2 Likes
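The small-file effect is easy to model: each file carries a fixed per-file cost (open/close, metadata lookups, per-file bookkeeping in the backup tool), so effective throughput collapses as the file count grows. A toy model, with the 5 ms per-file figure assumed purely for illustration:

```python
# Toy model: total time = per-file overhead + raw transfer time.
def effective_mb_per_s(total_gb, n_files, raw_mb_per_s=2_600, per_file_s=0.005):
    total_mb = total_gb * 1_000
    seconds = n_files * per_file_s + total_mb / raw_mb_per_s
    return total_mb / seconds

print(f"{effective_mb_per_s(100, 1):.0f} MB/s")          # one 100 GB file: ~2,600
print(f"{effective_mb_per_s(100, 400_000):.0f} MB/s")    # 400k files: ~49
print(f"{effective_mb_per_s(100, 2_000_000):.0f} MB/s")  # 2M tiny files: ~10
```

With assumptions in that ballpark, ~50 MB/s for an incremental pass over lots of small files is unsurprising even on a fast link.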

I forgot to ask, is Arq your offsite backup solution?

Yes, it is.

Though I am considering a second Synology instead (one in my home office, another in my real office).

1 Like

Agreed - the difference is that most businesses have full-time IT departments. I am my own IT department - especially if/when things fail. The ability to have a cloned main SSD at all times would be really appealing to me.

1 Like

I would prefer that too. I was only pointing out that incremental backups are an accepted standard.

3 Likes

You don’t have to justify the Mac Pro, it’s awesome. If you want it and you’ve got the means, get it.

5 Likes

It also depends on what checks are being done on the data, encryption, etc. The writes are limited by how fast the CPU can process the data.

But I agree - I’d consider how much the data actually changes, or whether you need the external storage to be fast for reading purposes. It’s not just writes. DT database changes are going to be a non-issue.
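One way to see the CPU ceiling: time how fast a single core can checksum data. If that rate is below the link rate, hashing (or encryption) caps the backup regardless of SSD speed. A quick sketch:

```python
import hashlib
import time

data = b"\x00" * (256 * 1024 * 1024)  # 256 MB of dummy data in RAM

start = time.perf_counter()
hashlib.sha256(data).hexdigest()
elapsed = time.perf_counter() - start

# If this prints, say, 500 MB/s, a single-threaded checksumming backup
# can't saturate a 2,600 MB/s link no matter how fast both SSDs are.
print(f"SHA-256: {len(data) / 1e6 / elapsed:.0f} MB/s on one core")
```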

CCC's documentation warns people off the idea of relying on a “bootable clone” with current macOS versions: Creating legacy bootable copies of macOS (Big Sur and later) | Carbon Copy Cloner | Bombich Software, most specifically:

we do not support nor recommend making bootable copies of the system as part of a backup strategy.

If you have a plan to address that, that’s cool - just be aware that CCC on ARM Macs has a number of “gotchas” around the strategy you’re considering.

2 Likes

That would be my guess. I’ve encountered that problem on Mac, Windows, and Linux systems.

“With small files like 4K, you can spend more time finding, opening and closing the file than you do reading or writing the data.”

1 Like