For a $199 consumer printer?
Ah, that’s right. My bad. I stand corrected.
imagePROGRAF Pro-1000, roughly $1,000/€1,000.
Cheaper models don’t include on-site service, but most still come with two years. But isn’t Apple courting “professionals” with its MBPs and Mac Pros? Or are they, in the end, just glorified influencer and blogger tools?
Since Canon offers different warranties depending on price (and I don’t have one of the really expensive ones), I’d expect more for the MBPs and especially the Mac Pros.
In the US, anyway, the Pro-1000 has a one-year warranty. They agree to do remote diagnosis and let you ship the printer to them or take it to a service facility. A two-year warranty with access to on-site service is available through Pro Care for $100. Unlike with AppleCare, there are no deductibles.
Canon’s Pro Care programs are really good, and I agree Apple could aspire to emulate them for its business coverage. But their manufacturer’s warranties are similar (at least in the US), and Canon would not be able to scale its Pro Care coverage for immobile printers and carefully handled cameras to millions of easily dropped retail consumer products without making some changes.
Just looked it up. European warranties are better.
Completely agree, stated in a verbose manner to get around the 20-character minimum.
Yes indeed (although if you buy as a professional in Italy, the warranty is one year; two years is for consumers).
Which is way less than I am getting from other manufacturers. Apple is only giving the legal minimum (“defects present at shipping”), and only after being fined.
Also, comparing ‘on-site’ corporate aftercare to consumer aftercare isn’t a level comparison.
HP products are so lousy that I once bought a USB-C-to-HDMI adapter from them that came with only a 6-month guarantee… and it worked poorly from day one. Apple’s adapters come with a 1-year warranty, so there’s one particular area where HP compares poorly to Apple.
Apple offers a 1-year warranty on all new products.
In the US, that is. There are other areas (China, EU) that mandate longer warranties, though prices are often adjusted to reflect that cost.
I agree with you completely. Windows is not necessarily good value for money.
I have a workstation at work that I use exclusively for machine learning, because it can hold a top-of-the-range Nvidia GPU that supports CUDA. I will not invest in an eGPU just for this purpose, so I mostly access the workstation via remote desktop. It cost €4,000, so it is almost as expensive as a Mac Pro, which I would have chosen if it had Nvidia support.
As for the “ultimate” computer, I am very happy with my Mac and would not do normal work on any other system. I also use Linux in the cloud all day, since I deal with big data (on the order of petabytes) and need access to a cluster to crunch it. Everything I do is controlled through my trusty Mac, though, and all the software I’ve written in the last decade has been written on macOS because there is nothing better. I am not a big gamer, but I find my 16" with a Vega 56 eGPU meets all my needs (Warcraft and Elder Scrolls are all I play), and I do not even need Boot Camp as I am not interested in playing other games.
I just wish Excel were better, but Parallels is a good enough solution when I need the power features that haven’t been ported.
Similar requirements here. No CUDA on macOS is insane. That’s why I also have a “normal” PC with Nvidia.
Offloading huge tasks to a Linux cluster is very easy on macOS. Being a *nix OS, SSH and friends are built in from the get-go.
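To give a sense of how little ceremony that takes, here is a minimal sketch that shells out to the stock OpenSSH client; the host name, the remote command, and key-based auth being already set up are all assumptions for illustration:

```python
# Minimal sketch: kicking off work on a remote Linux box from macOS,
# using the stock OpenSSH client. Host and command are placeholders.
import subprocess


def run_remote(host: str, command: str, ssh=("ssh",)) -> str:
    """Run a command on `host` via ssh and return its stdout.

    The `ssh` tuple is injectable so the helper can be exercised
    with a stand-in command instead of a real connection.
    """
    result = subprocess.run(
        [*ssh, host, command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout


# e.g. run_remote("user@cluster.example.com", "sbatch crunch_job.sh")
```

Anything fancier (port forwarding, rsync for results, etc.) is equally built in, since the whole OpenSSH toolchain ships with macOS.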
As for Excel: this one drives me nuts. Really nuts.
Serious question, slightly off topic, genuine curiosity: does buying a workstation (or even a locally hosted server) to run non-interactive CPU/GPU workloads make financial sense anymore? We have researchers on site here who insist that they need local (sometimes even desktop/desk-side) compute resources, but every time we run the numbers, they come out heavily in favour of cloud-based services.
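For what it’s worth, the back-of-the-envelope version of that numbers exercise is tiny; every figure below is a made-up placeholder, not a real quote:

```python
# Hypothetical break-even sketch: owned workstation vs. on-demand cloud GPU.
# All numbers are placeholders; substitute your own quotes.
workstation_cost = 4000.0   # one-off purchase price (EUR)
power_per_hour = 0.10       # electricity while running (EUR/h)
cloud_per_hour = 1.50       # on-demand GPU instance (EUR/h)

life_years = 3              # assumed useful life of the workstation
hours_per_year = 1000       # actual utilisation: the decisive variable

hours = life_years * hours_per_year
local_total = workstation_cost + power_per_hour * hours
cloud_total = cloud_per_hour * hours

print(f"over {life_years} years: local €{local_total:.0f} vs cloud €{cloud_total:.0f}")
```

With these placeholder rates, local only wins above roughly 950 hours of real utilisation per year; at typical low utilisation, the cloud wins, which is usually what the run-the-numbers exercise shows.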
On-topic post: to me, Macs “officially” run the widest selection of software with the least amount of fuss. When using a trackpad, Mission Control is by far the best way of managing dozens of windows across multiple displays (and given that so many workloads are moving into “the cloud”, workspace management is of increasing relative importance for local computing devices).
If I had to choose just one computer to use, it would currently be my 16" MBP, so in that sense, for me, it is the ultimate computer of the day. That’s highly contextual though; when I’m travelling, my iPad Pro jumps into the “ultimate computer” spot even though it’s far less capable of doing some of the sorts of things that I do.
That is a decision that every organization will probably have to make at some point. Capital One has closed two of its data centers already and is closing the third this year. Spinning up servers as needed is making sense for a lot of companies.
“Gartner predicts that by 2025, 80% of enterprises will shut down their traditional data centers. In fact, 10% of organizations already have.”
It depends on the work being done.
For non-interactive work, I’m not sure it is worth the investment. But the connection to a data centre can often be too high-latency, and having compute local can make working much more pleasant, especially with complex imaging and high-resolution video; uploading and downloading results can take a long time even with fast internet. So I’d say your researchers do have a case: local hardware can make working much less problematic and more efficient.
In the case of my research, the machine is being used for image-recognition analysis, so it is interactive. It is hooked up to an audio-visual studio, and we’re doing a lot of work on real-time recognition with lidar cameras. If it were my money, I’d say it’s extremely expensive, but we have funding for the research. The cloud is also expensive; we spend hundreds each week on AWS for data analytics.
I use Windows at work, which I find regrettable. Windows 10 has won a lot of plaudits, but I agree with the most oft-cited one: “It’s not as bad as it used to be.” macOS Catalina is buggy. Windows 10 is sloppy. Add on additional software from Microsoft and the Windows experience just goes downhill. Excel I find hard to fault. Word perhaps the same (for what it is). But the latest version of Outlook is stunning in its sloppiness. Then there are OneNote, Teams, and Skype for Business. Sloppy, sloppy, sloppy.
Admittedly having a corporate-lockdown on my Windows laptop makes things even harder, but the basic behaviour of Windows negatively affects me daily.
What an interesting question. If you had asked me this 3 or 4 years ago, the answer would have been a resounding yes. Now I am not so sure. I am using a Windows laptop to do my dictation. Dictation is the only thing this laptop does, and it functions adequately: not particularly a joy to use, but it gets the job done.
A lot of my web consumption is done on iPads. They do a great job of keeping notes and calendars in sync.
I still like typing on a real keyboard with a “big” screen for writing. Scrivener 3, DEVONthink, Tinderbox, … I won’t be finding these on Windows or Linux. There are some excellent, well-thought-out programs on Mac OS X - or whatever it’s called now. I like the OS because it is smooth and lets me get stuff done with the least amount of friction; once that stops being the case - “buggy” and “sloppy” - what use is it to me?
But I am still stuck on Sierra. It grates on me that I will lose functionality by upgrading hardware and software (I will miss Aperture); this really should not be the case. I should have visions of joy about the latest and greatest hardware, rather than dread figuring out what I am going to lose.
I use Vim for a lot of text editing. I have also been playing around with Org mode and dabbling with my son’s Raspberry Pi. Would it be so bad to add a Linux machine to the mix (or dual boot) and not depend so much on proprietary software? The thought of more and more subscriptions daunts me. What things am I willing to pay higher costs for? What things are nice, but, with a little work and knowledge, can I let go of?
I’m working between two machines at home currently: a work-issued laptop (Fujitsu Lifebook) and my 2017 MacBook Pro. The Windows machine is sturdy and well built, with a good display and a comfortable, bulletproof design - better than any ThinkPad I’ve used, to be honest!
But when I work on the MacBook, I can type at immense speed, sit in the back garden in bright sun, listen to pretty great, loud music, and work for hours before it interrupts me with the low-battery warning. I find remoting into the Windows machine is the optimal way to work, to be honest. I can swipe across to iMessage, my Apple Music library, and whatnot on my Mac while working. Sure, I could remote in from a Windows machine, but would the screen be as bright? Would the battery last as long? That’s the kind of workflow where I find myself thinking, “damn, this thing is fantastic”.
I find with Windows, if you do one thing at a time - open one programme at a time, spaced well apart - and keep it from going into ‘jet fan’ mode, it’s fine. I think the Mac has a smoother temperature/fan-response curve, possibly due to Apple’s obsession with keeping it quiet at the expense of heat. Whereas I’d imagine most Windows manufacturers, naturally given the diversity of their models, use a more rudimentary, step-like model, where too much load within a few seconds ramps the fans up fast and activates the Core-i Turbo Boost, etc., to keep things running smoothly, and this hurts battery life.
Apple’s advantage is that, by making only a few models and having more time to fine-tune, they’ve got the optimisation down to a tee. It really pays off, to me.
I wish Windows OEMs had this figured out; at the moment it’s like computing ten years ago: only use apps like Word, keep the brightness low, change the power plan to low-power mode, etc. I can imagine an old-school keynote of Bertrand Serlet saying in his accent, “Power plans? No end user should ever have to know about that!” lol