My iMac has started resetting my mouse's tracking speed upon every restart. While somewhat frustrating, it's pretty easy to open up System Preferences -> Mouse, and update the tracking speed to one notch below "Fast" and get on with my work.
While it's gotten a little old, it also got me thinking: why does Apple measure mouse movement in terms of "tracking speed"? And what is tracking speed, anyway?
After doing a "fair bit" of research (read: jumping to the mouse speed section on Wikipedia), I encountered an interestingly named measurement called "Mickeys per second" (tee hee). It makes some sense: according to Wikipedia, it measures "the ratio between how many pixels the cursor moves on the screen and how far the mouse moves on the mouse pad."
While, at some point in the past, this might have been a completely sensible measurement, we've moved somewhat beyond pixels. Pixels used to be visible to the naked eye, but with today's 4K and 5K displays, that's no longer true. What also struck me was that, unless Mickeys per second could change with each display, setups with multiple displays would need a variable number of Mickeys per second to render a constant-speed mouse pointer (at least in physical space). Obviously, behind the scenes, modern operating systems are flexible and take this into account, but that dynamic behavior is hidden from the end user.
Let's go back to the beginning. Here I am, updating my tracking speed a couple times a week. When I do something more than once, my instinct is to find a way to stop doing it. Ultimately, I came to the conclusion that we (or really, Apple or Microsoft) are thinking about this in the wrong way.
Think about it. Mouse velocity comes down to three things:
1. The "reach" of the user's hand (i.e., the maximum distance the center of the mouse sensor can be moved by the user from one side to the other).
2. The size of the screen.
3. The "intent" distance (i.e., the smallest intentional movement a user can make).
Without taking user comfort into account, the absolute minimum for this hypothetical measurement should be one screen per reach. No matter how good you are with computers, it's a bad experience if you need to lift up your mouse several times to position the cursor in the right place. The maximum, again, needs to be user-specific: if mouse control is erratic or difficult for the user, the intent value should be larger than for someone with good hand dexterity.
Since we don't want the cursor to move at all until the mouse has traveled at least the intent distance, and we don't want a full reach to carry the cursor less than the full width of the screen, we can determine an upper bound and lower bound for mouse velocity. Better yet, extracting these values doesn't require asking the user to drag a slider along some arbitrary scale.
It would be relatively easy to determine these values automatically with a simple tool: ask the user to move the mouse from side to side, then display a grid and prompt them to click two points as close to each other as possible. Since the OS already knows the screen size, it's then a simple matter of crunching the numbers into an internal value.
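To make the number-crunching concrete, here's a rough sketch of how the two bounds might fall out of those measurements. Every value below is a made-up example, not real data, and the "smallest target" size is an assumption I'm adding for illustration:

```shell
SCREEN_PX=1440   # horizontal resolution of the display, in pixels
REACH_MM=300     # the user's measured "reach": one comfortable sweep, in mm
INTENT_MM=2      # smallest intentional movement the user can make, in mm
TARGET_PX=16     # hypothetical smallest on-screen target, e.g. a small button

# Lower bound: one full sweep of the hand must carry the cursor across
# the entire screen (shell integer math, so this truncates).
MIN_PX_PER_MM=$(( SCREEN_PX / REACH_MM ))

# Upper bound: the smallest intentional movement shouldn't overshoot
# the smallest target the user needs to hit.
MAX_PX_PER_MM=$(( TARGET_PX / INTENT_MM ))

echo "tracking speed should land between $MIN_PX_PER_MM and $MAX_PX_PER_MM px/mm"
```

With these example numbers, sensitivity should land somewhere between 4 and 8 pixels per millimeter, and the OS could pick a default in that window instead of making the user fiddle with a slider.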
There are probably bigger fish to fry on Apple's Mac OS X team, but I think this would be a huge improvement to the user experience and would make the setting a lot less opaque to end users.
Marco Arment released an iOS 9 Content Blocker on Wednesday and it quickly rocketed up to #1 on the Paid App Store charts. At that ranking, apps can pull in tens of thousands of dollars per day.
For very good reasons, he removed it from sale this morning. And then things got nasty.
Thing is, I think I understand what this week has been like for Marco. In a lot of ways, it reminds me of what happened when a site I wrote went viral a few years ago.
If you've never made something that's gone viral before, let me break down what it feels like for the maker:
You're in shock. You can't believe something you made resonated with so many people.
You freak out because your inbox has become a disaster.
You try to get some work done and deal with the explosion of feature requests and attention.
You start reading online and notice people are saying some really shitty things about you.
At this point you can't think about anything else except the shit that people are giving you.
You want to shut it all down and forget about it.
For anyone who thinks Marco thought this through or planned it in advance, you're just deluding yourself. Who in the world could anticipate making the #1 paid app in the App Store? I don't think any independent developer has ever done this outside the context of a game.
If I were Marco, I'd have been feeling dejected and depressed by this morning and would've wanted all of the attention to go away. No one person can deal with it alone.
In his post, Marco specifically pointed readers to instructions on how to get App Store refunds. I mean, people, if he was actually trying to scam you, don't you think he'd just leave the app up on the store and just never say anything else about it? At least he's being honest. If you already purchased the app, it's not like you didn't get anything in return.
You bought a working app. It still works. It will continue to work. There is no scam. Marco isn't being an asshole. If he's anything like me, he's overwhelmed and just wants to get back to normal life. Being in the crosshairs can really suck.
Be a little more empathetic. Thank Marco for being honest. Respect his decision. He's just a person, like you and me. I realize how easy it can be to forget that when everyone is just an avatar, but please take it to heart. I wish more people had when a similar thing happened to me 4 years ago.
We're vacationing in Whistler, BC right now as "endurance spectators" to my father-in-law's 3rd Ironman triathlon. Expecting some beautiful landscapes and weather, I brought my newly acquired X100T to take some nice photos.
Yesterday, I set it up on an interval timer and pointed it right towards Rainbow Mountain, which faces the kitchen patio of the little condo unit we're renting. After all was said and done, I ended up with 400 images of clouds moving over a mountain peak and not much idea of what to do with them. So, as any self-respecting engineer would, I set out to create a time-lapse using only my trusty command-line tools: FFmpeg and ImageMagick.
Let's get down to it.
Note: Everything in this tutorial assumes that you have a current copy of ImageMagick and FFmpeg installed on your machine.
Even though I turned off RAW on the X100T, the images were still pretty huge (4896x3264). During my first tests, making movies from images this large gave really inconsistent results and took a long time to create, with not much extra benefit.
Therefore, the first thing you should probably do is check the size of your images and, if necessary, resize them to be a bit smaller so they will play more nicely with FFmpeg and any other image manipulation that you're going to do.
Since I planned to upload my video to YouTube, I referenced a handy page they have that lists out their preferred resolutions, codecs, and formats for upload (https://support.google.com/youtube/answer/1722171?hl=en). If you're like me, and you don't care too much about maintaining the current aspect ratio, here's what you can do. This will resize your images to a preferred resolution (in this case, 1280x720), and will potentially crop off the sides or top in the process. To start, make sure you're in the directory with all of your photos.
$ for FILE in `ls *.JPG`; do \
mogrify -resize 1280x720^ -gravity center -crop 1280x720+0+0 +repage -write RESIZED_PHOTO_DIRECTORY/$FILE $FILE; \
done
In detail, this command resizes each photo to fill a 1280x720 box (the caret means 1280x720 is treated as a minimum: the image is scaled, preserving its aspect ratio, until both dimensions are at least that large). Then, with -gravity center, the -crop trims the image to exactly 1280x720, +repage resets the canvas geometry, and -write saves the result to RESIZED_PHOTO_DIRECTORY/$FILE. Phew, that was a mouthful.
If you just want to resize to a certain width or height and maintain the original aspect ratio, just do this:
$ for FILE in `ls *.JPG`; do \
mogrify -resize 600x -write RESIZED_PHOTO_DIRECTORY/$FILE $FILE; \
done
Maintaining Color Distribution
Note: this step might not be necessary in your situation, but it greatly improved the quality of the final product for me. YMMV.
Sometimes images captured in a time lapse have very different histograms (especially if you have auto-aperture / shutter-speed enabled), and this can make things look "jumpy" from frame to frame. Obviously, this won't look great in your final video, so we're going to normalize the colors to a set distribution.
For an example, just compare the following two images (especially notice the trees, which are much lighter in the first example than the second):
Not ideal, right?
To help achieve this end, I used an ImageMagick script called histmatch, generously provided by Fred Weinhaus (link: http://www.fmwconcepts.com/imagemagick/histmatch/index.php). The idea is to pick a reference image whose histogram we want all of the other images to match. Once you've decided on your reference image, run histmatch on every image except the reference image (otherwise the universe will explode).
(I just piped the output of ls *.JPG into a file called normalize.sh and used some of my Vim-fu to do this. Your process might be different.)
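If you'd rather loop than hand-edit a script file, it might look something like this. I'm assuming histmatch is executable and on your PATH, and that it's invoked as histmatch <reference> <input> <output> — double-check the argument order against Fred's own usage notes. The reference filename and output directory below are placeholders:

```shell
# _DSF0200.JPG is a placeholder -- substitute your chosen reference image.
REFERENCE=_DSF0200.JPG
mkdir -p NORMALIZED_PHOTO_DIRECTORY

for FILE in `ls *.JPG`; do
  # Skip the reference image itself (remember: universe explosion).
  if [ "$FILE" != "$REFERENCE" ]; then
    histmatch "$REFERENCE" "$FILE" NORMALIZED_PHOTO_DIRECTORY/"$FILE"
  fi
done
```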
Finally, make the darned movie
This is the fun part. Just send the files through to FFmpeg and have it do its magic. If your filenames are sequentially numbered, you'll want to provide the parameters below (like -start_number and the _DSF%04d.JPG format) to make things match up.
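The general shape of the command is below. I'm assuming files named _DSF0001.JPG and up; the frame rate is an arbitrary choice, so tune it (and the start number) for your own shots:

```shell
$ ffmpeg -framerate 24 -start_number 1 -i _DSF%04d.JPG \
    -c:v libx264 -pix_fmt yuv420p video.mp4
```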
This tells FFmpeg to take all of the JPEGs in the directory starting with _DSF and ending with 4 digits, and to output an h.264 video with the yuv420p colorspace to video.mp4. You now have a beautiful timelapse!
It's now been a few hours since The WWDC 2015 Keynote, and I've had some time to digest everything. My immediate impression was a little "meh", but then again, that's sort of how I feel every year. It's hard to satisfy everyone.
My second impression is that the WWDC Keynote is no longer for developers. It's for end users and wannabe developers. We old-timers are too jaded to care about the new and shiny stuff, and most of the new products in the keynote aren't even things developers can use (on that note, was it really necessary to spend that much time on Apple Music?).
Another noteworthy thing: no new hardware. I was expecting at least something, so hearing crickets was a little unfortunate. I can only assume that the inordinate amount of time spent talking about Apple Music was in some part due to a need to "fill time" left by what would have originally been a 15-20 minute spiel on Apple TV.
As always, the things that are exciting to me happen in the sessions throughout the week. The Keynote is sort of just a preview of what's to come. After some perusing through the documentation, here's what I'm excited about:
I wrote about this in my WWDC 2015 Wishlist, and it came true. You can now link up URLs to be opened by your application. I haven't had too much time to play around with iOS 9 yet, so I'm not sure how this works from either the developer or end-user side of things, but my first impression is that Apple did this right.
Besides the notable backtracking of moving search away from the pull-down gesture and back to the left of the home screen (I would have loved to be a fly on the wall in that meeting), iOS Search can now display search results straight from apps in Spotlight.
Just to illustrate an example—my company writes an app called Tweet Seeker that lets people download their Twitter archives and search their tweets locally on their device. Tweet Seeker can now hook right into iOS and display tweet search results right from Spotlight. Now that's cool!
Remember the iPad Pro? Well, this is it. You take a regular iPad, and you install iOS 9. There. iPad Pro. :claps:
You know those things that you can't really imagine a use for, or even want, and then in 3 months you realize you can't live without them? Yeah, well, I'd put money on this being one of those things.
I can see this changing how people use their iPads, along with the new keyboard gestures. The iPad is no longer just a toy, or larger screen iPhone made for watching Netflix and HBO. It's now a tool to get shit done.
I think it's great news that Swift is going open source "later this year". Of course, in Apple parlance, that probably means somewhere around December 15-31 (just trying to be realistic here: it's not easy to open-source something like this, and I'd bet there is a ton of proprietary code lurking in that codebase).
Also, with this release, I'm finally comfortable picking up the Swift book and starting to actually learn the language. I am embarrassed (only a bit, though) that I haven't written a single line of Swift code. A 2.0 implies a little more stability, and other developers who were waiting on the sidelines will probably jump in on the fun as well.
Another telling thing is that all the code I saw in the "Developers State of the Union" Keynote was written in Swift. I don't think it's any secret now that Apple considers Swift to be the future. As much as we might not want it, it's going to happen.
If you started learning it a few months ago, though, prepare to have to relearn a bunch of stuff. I wouldn't be surprised if Apple pulled the rug out from under you.
I'll keep it brief. I don't need it. Spotify works great for me. I think its success hinges on musicians buying in. We don't want another Ping here.
You'll notice I skipped over a lot of the OS X stuff. I honestly haven't had enough time to digest it all. So. Much. New. Stuff. Will download and report back.
That's it for now. I'm taking a walk to think about this all some more.