I have used Macs for 15 years, but convincing businesses to switch still isn’t easy. Or do we need to convince Apple to care more about the needs of companies?
European employers spend up to 2,000 USD on a notebook for a software developer. On that budget, a company could easily buy two MacBook Airs, but a MacBook Air would need more RAM to meet the needs of many developers.
The cheapest Apple notebook with 64 GB of RAM now requires an M3 Max with a 40-core GPU and costs at least $3,999 plus tax. That configuration only has a 512 GB SSD but comes with (overly) enormous CPU and GPU performance.
64 GB of DDR5 RAM costs 180 USD, and a fast internal 1 TB M.2 SSD costs 50 USD on Amazon. IT professionals and managers should know these prices.
I know many professional software developers and managers who want 64 GB of RAM and a 1 TB SSD, but who wouldn’t pay for a high-end CPU/GPU like the M3 Max, which is perfectly suitable for enthusiasts, gamers, game developers, video editors, special-effects designers, and some scientists.
What’s the consequence? Many software developers use Windows and Linux on their company notebooks and privately buy a MacBook Air without using it to develop software. I’m a fan and happily pay crazy prices for Apple products, but this situation doesn’t help everyone who likes Apple.
I have worked with dozens of companies specializing in developing non-Windows software, and I talk to many people because I am interested in this topic. Fifteen years ago, nearly nobody used a Mac. Things have improved: now I’m no longer the only one, but the Mac’s market share in Western Europe is still regularly below 10 percent.
Many European employers happily buy iPhones for their employees, but they hesitate when it comes to Macs, mainly because of prices and budgets. They probably don’t trust the MacBook Air M1/M2 models, which are extremely fast but don’t offer much RAM. Some companies purchase hundreds of non-Apple 2,000 USD computers, and many of their developers even install Linux desktop distributions on them because they can’t or don’t want to use Windows, and they can’t or don’t want to convince their managers to buy Macs, since they would have to take responsibility for any issues. These employers lose a great deal of time and money, because managing and integrating Linux computers is not an easy task for every software developer.
Companies buy for large numbers of users, must maximize their budgets, and are run by executives for whom “good enough” is the guiding light. Commodity Windows boxes and Android phones can be had for cheap. That has never been the market that Apple wants to be in.
I spent my programming career working on Windows front-ends and Unix backends. But when it was time to spend my own money, I bought Macs and iPhones.
P.S. My 2020 M1 MacBook Air with 16 GB of memory and a 1 TB SSD ran rings around the “fast” Intel i7 system that it replaced!
I have an honest question: why is the first use case mentioned for Macs in business always developers? Why aren’t the rank-and-file employees considered?
With the macOS integrated apps like Calendar, and especially Contacts and Messages, I can do so much more work in less time on a Mac than I can with a PC that it’s not even funny.
I’m in Canada working for a very large employer as a developer. Obviously I can’t speak to the EU market, but I can speak to some execu-thinking. The default right now for developers here is whatever the smallest SSD is, combined with 32 GB of RAM. That currently means a Dell Precision, but in the past it was an equivalent Lenovo. These machines as spec’d run 3,000-3,500 CAD. You can request more RAM and a larger SSD, but you need to be prepared to wait many weeks for your machine to arrive, as they don’t stock those in their depot and the requisition process is insanely slow. I asked about getting Macs and they’re doing some pilot projects, but everyone on the pilot is pretty unhappy.
The tools used to monitor employees and prevent data leaks just aren’t as mature, or at least not for us, and they require tons of manual intervention. It’s especially hard because of how regulated my industry (finance) is. Corporate spyware isn’t just a corporate policy, it’s a regulation. This makes standardizing on the MS suite and Windows-based machines a no-brainer for the wider firm, and using other options a huge pain from an IT and management perspective.
IMO the Mac didn’t become a viable general business computer until it ran Microsoft Office, and Apple was able to get Mail.app to work reliably with MS Exchange. After that it became an issue of getting businesses to pay for much more expensive hardware.
At my last company I did that by convincing the founder to try a MacBook after he accidentally destroyed his two-month-old ThinkPad. It wasn’t long before we had other executives wanting Macs and asking if they could be used in a new department that was being created.
Bottom line: Macs are more expensive and frequently offer no real advantage over a PC that management can see. You may need a high-level manager to champion your cause.
The core of this argument seems to be that Apple needs to eliminate “the Apple tax” - and that’s just not happening, especially in the age of Apple Silicon. If Apple offered 1 TB/64 GB in the MacBook Air line, it would still be well over $2,500 (just extrapolating from the difference between the 16 GB and 64 GB MacBook Pro prices).
Apple’s focus is on building high-performance machines. I could be wrong, but if you benchmark that $2k laptop vs the $4k Apple one, which performs better?
And do the software devs need a 64 GB machine, or is that just what they’re used to buying because the RAM sticks were historically cheap?
I’ve been retired for five years, and at that time the majority of our users were only using mail and calendar, and occasionally a spreadsheet in LibreOffice or Excel. Everything else was running in a browser. Serious question: how many people need high-performance machines these days?
In my experience, corporate IT managers look at purchase cost, not Total Cost of Ownership (TCO), and think that Apple kit is more expensive. However, the lifetime of Apple hardware tends to be three or even four times that of supposedly cheaper Windows-based machines. Windows machines are cheaply made and inherently quick to fail; the manufacturers need a steady stream of repeat customers who typically have to come back every 18 months.
I don’t know, but OP’s challenge seems to be that Apple’s pricing for 64 GB / 1 TB laptops is unreasonably high. If people are buying 64 GB / 1 TB laptops for mail, calendar, and spreadsheets, then they shouldn’t be buying 64 GB / 1 TB machines.
The entry level MacBook Airs are more expensive than cheap junky Windows laptops, but they’re still nicer machines. And the MBA will likely be around in half a dozen years, whereas the Windows laptop will be in the e-waste bin after 3 or 4.
I think that the vast majority of users would be well served for quite a while yet by an M1 Air class computer (16GB RAM, 512GB SSD). On the other hand, I think that for those of us who need more, the answer to the question, “How much more?” tends toward, “As much as possible.”
All joking aside, I’m a web dev and I’m about 99% sure that if my machine were solely a work machine I could probably get by with a base level MacBook Air. Possibly with a 16 GB RAM upgrade.
I’ll always give a vote for more configuration options. We can’t get a base M-series chip with 64 GB of RAM right now for technical reasons.
I’ve seen plenty of Europeans rolling around with nice work MBPs, though. Somehow they’re getting approved.
Companies that distrust their developers when they say they need certain computers usually have problems with hiring and product quality. So if nothing else, seeing other companies succeed might increase these standard budgets where you’re from.
And don’t get me wrong, me too - I always appreciate configuration options. I’m just wondering who it is that’s going to be buying 64 GB entry-level M series laptops in a sufficient quantity for it to be worth it for Apple to put all that stuff in the SoC.
There may not be enough of a market to expand the Mac line further. As of April 2023 Macs were 8.3% of Apple’s revenue and the MacBook Air is said to be around half of that.
Yes, I’m not really sure. At some level the lack of market opportunity has to constrain them, but the chip design doesn’t seem easy. I think you can buy enough space by cutting two performance cores and four GPU cores, but you’re looking at taking the four LPDDR5 spots on the Max and somehow fitting them onto a Pro-sized SoC, which currently has two. You don’t have room to run the extra two spots along the same edge, and proximity to the SLC and GPU is still important. I would think optimizing all of that for an in-between chip would compromise the optimization and yield of the Max and Ultra too much. If they’re trying to push beyond the Ultra, they really can’t spare any minds for anything else.
If they do figure it out, since it would sit between the entry level and the Pro, they could call it the M3 Power User.
A decent Dell laptop will easily last 3 years and probably up to 5 for an all-in price of about £1k. An Apple laptop is not going to last more than 6 years, especially as a low-end Apple laptop will lose OS support within 7 years at the latest.
A Dell laptop can have its RAM and SSD upgraded relatively cheaply and easily within that 3-5 year period.
I’m an Apple fan, but while much is made of how good Apple hardware is (and it is), many companies cannot accommodate it, and Apple adopting its own processors has harmed its case with corporate buyers instead of helping.