People have long extended the functionality of the iPad with external hardware, and the most popular add-on is probably a real Bluetooth keyboard. Adding this one piece of hardware immediately transforms the iPad from a simple entertainment device into a real productivity tool.
If you are going to read one review of the Microsoft Surface 2, make it this one by Lukas Mathis, a long-time Mac user.
Another difference between the Surface and an iPad is the Surface’s split screen mode. iPad owners often note that the iPad’s «one app owns the screen» system is a good idea, since people can’t multitask anyway. But that ignores that people often need multiple apps to work on a single task. I can’t count the instances where I’ve used split screen mode just in the last few days. I’m in a meeting, taking notes in OneNote while looking at last week’s meeting notes. I’m responding to an email while looking at a spec. I’m making a drawing while looking at a reference. I’m changing a mockup based on feedback in an email. I’m taking notes during a Skype call.
This is definitely one of my main issues with the iPad as well. Not being able to research one thing while writing about another is a real pain, and pasting the research material into the writing app is a poor way of solving the problem.
What I would like to see is something like what Microsoft has done with the Surface, but with an Apple twist: a main app running in normal iPad mode, with the ability to run a second app in a quarter of the screen in landscape mode. The twist would be that the secondary app has to be a multi-platform app, and when brought up in secondary mode, its iPhone user interface is shown.
This could fit quite nicely, and I can imagine a lot of use cases where it would be a fantastic way of getting things done.
The problem with Metro might not be that it’s performing badly at its intended function. The problem might simply be that, unlike me, most people don’t want to use their tablets for productivity. They’d rather keep using their old Windows PC for that, and also have an iPad for watching movies and playing games.
This is a valid and fair point. Keeping distinctly separate devices for work and play can definitely bring peace of mind and help you focus[1].
I personally prefer using the iPad for as many things as possible, including reading, answering email, and running productivity tools such as OmniFocus and the calendar. When in serious “work mode”, though, nothing beats the MacBook Air.
[1] Don’t add your work email to your iPad if you only use it for personal tasks and entertainment, though.
Information regarding the vulnerability is currently terribly scarce, but judging by Apple’s knowledge base article, it sounds very serious indeed: it would allow man-in-the-middle attacks on SSL/TLS connections.
What this means in practice is that someone sitting between you and a target site, such as your bank or Facebook, could listen in on your traffic and potentially modify information as it is sent to the server.
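To see why broken certificate verification is so dangerous, consider what a TLS client looks like with and without it. This is an illustrative sketch using Python’s standard `ssl` module, not Apple’s actual code (the real bug was inside Apple’s SecureTransport library); a client whose verification is effectively disabled will happily complete a handshake with an attacker presenting any certificate at all.

```python
import ssl

def make_context(verify: bool = True) -> ssl.SSLContext:
    """Build a TLS client context; verify=False simulates a broken client."""
    if verify:
        # Secure default: verifies the server's certificate chain AND that
        # the certificate actually matches the hostname being contacted.
        return ssl.create_default_context()
    # Insecure: accepts any certificate from anyone. A man-in-the-middle
    # can now present their own key and read or modify all traffic.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False        # must be disabled before verify_mode
    ctx.verify_mode = ssl.CERT_NONE
    return ctx

secure = make_context(True)
broken = make_context(False)
```

The point of the sketch: the secure context refuses to talk to a server whose certificate does not check out, while the broken one connects regardless, which is exactly the condition the Apple bug reportedly created for its users.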
A few days ago, Apple celebrated the 30th anniversary of the Macintosh. To mark the occasion, here is a keynote from 1984 in which Steve Jobs presents the very first Macintosh to the Boston Computer Society.
You get to see the moment Steve became the Steve Jobs. Seeing him smiling up there is how a lot of us would like to remember him.
When Apple first released Photo Stream as part of its iCloud service, I was excited to finally have all my photos automatically transferred between my devices. They were also automatically backed up to my Mac, which meant that syncing my iPhone to iTunes would be a thing of the past.
What I failed to realize at the time was that although automatically backing up all photos to my Mac was a breeze, there was no convenient way to view older photos the way they were meant to be viewed – on the crisp Retina Display on my iPad.
Several services have tried to offer ubiquitous access to all your photos, and Everpix was just that kind of service. Once configured, it was essentially a set-and-forget solution where all photos were automatically uploaded to its servers. If you followed the link above, you will have noticed that it has since shut down, apparently after running out of money.
I found another solution to my problem, and you have probably heard of this service before: it comes from Yahoo and is called Flickr.
In a recent Flickr for iOS update, the ability to automatically upload captured photos to a private set was added. This gives you the same set-and-forget setup that Everpix once offered, and the 1 TB of free storage will undoubtedly last you a very long time.
The problem with the Flickr iOS app is still the viewing part, however, which is why I bought Flickring for iPhone and iPad. It connects to your Flickr account and presents your sets and photostream beautifully.
Since both Aperture and Lightroom support publishing to Flickr, you will always have access to all your photos taken with your traditional camera as well, as long as you have internet access or have synced the photos for offline viewing using Flickring.
The new iPad mini with Retina Display was quietly released by Apple earlier today.
There have been reports of a low initial stock, so be sure to order one online immediately if you want that retina goodness in time for Christmas. If you are lucky enough to live in the US, you can schedule a pickup from your local Apple Store today.
I went ahead and ordered the space grey 32 GB LTE model, together with a Product Red Smart Cover. Estimated ship dates in Sweden seem to be set for the first week of December, even though the Apple Store says 5–10 business days.
As you have undoubtedly heard, Apple just released the next major version of its operating system. Having run out of cats to name releases after, it has switched to Californian landmarks, and the first to be Applified is Mavericks.
If there is one review of OS X 10.9 Mavericks you should read, it’s the one by John Siracusa for Ars Technica.
One of the least used features on my iPhone has traditionally been Spotlight search, located on the leftmost home screen. You know, the screen you accidentally swipe to when you are really looking for something else.
Things have changed considerably in iOS 7. There is no longer a dedicated screen for Spotlight search; it is now part of every home screen and can be activated by swiping down with one finger anywhere on the screen, except at the very top and bottom edges, which are reserved for Notification Center and Control Center.
Spotlight is used to find things, fast. It will find anything in your calendar, contacts, and even in email and notes. It has another trick as well: type the name of any installed app and it will find it for you, and a single tap then launches it.
Why even bother with this?
The brilliance of this approach to launching apps lies in the fact that only a limited number of apps fit on the first page of your home screen. If you are like me, the first page holds your most-used apps, while the rest are tucked away neatly (or, more likely, at random) in some folder where you will never find them again.
What this all means is that instead of swiping to the correct screen and opening the correct folder to find the app you are looking for, you can casually swipe down, type the first few characters of its name, and launch it. Spotlight even learns which applications you search for most often and displays them at the top.
Launching apps has never been easier. I would, however, like Apple to take this to the next level with a dedicated home screen on the far left that automatically populates with your most-used apps, and perhaps your frequently used contacts.