Apple Photo Smart Collections can be hilarious

June 21st, 2016 No comments

Apple announced a new feature in iOS 10 where your iPhone creates smart collections of your photos and surfaces them to you whenever you want. They’ll show you vacation collections, best-ofs, certain dates that seem memorable. They call this feature ‘Memories’.

The collections around locations and people are quite good! The logic here is easy: if you spend a short amount of time in a location, or take a lot of pictures there, group them together. Similarly, the face and object recognition are pretty good as well and let you easily see a lot of pictures of anyone in your photo collection. I’m quite impressed with both.

Now where it breaks down is Apple’s ‘Best Of <X>’ collections. X can be last week, the last 3 months, last year, perhaps more options I haven’t seen yet. The big question is how Apple selects for ‘Best’. Best is a rather arbitrary and personal decision. It seems to me they base it on how many times you’ve shared each photo in various ways (iMessage, AirDrop, email, perhaps even when it’s selected from a 3rd party photo picker). This metric makes sense: the photos you send out the most are obviously the photos you wanted most visible. It should select for photos that you want shown and reduce the chance that any photos you regret, or want to keep private, show up in these collections.

They also let you press one button to make a movie from a small sample of these photos. I think that sample is again based on the highest-shared photos. Which is great for most people! For me… not so much.

I don’t use photos the way I imagine Apple imagines most people do. I do take a lot of vacation photos, and a couple out at a bar or restaurant, but I usually get those printed or make a photo book; I don’t email every one of my vacation photos to my family, just a couple while we’re there so they can see what it’s like. I also take photos for when I sell my furniture on craigslist (and don’t delete them), which I usually just AirDrop to my laptop to upload. I took a lot of pictures when house hunting and sent them to various family members to show what we were seeing and get opinions. I also save memes to my camera roll to share with multiple sets of people. I save gifs to post in various places. I take pictures of silly things that look ridiculous and send them off to various people. The hockey community comes up with some of the best images for taunting your friends who root for other teams.

So when I go into one of these collections (Best of 2015 is my favorite) and just hit the play movie button, it’s not the myriad vacation photos or pictures out with friends that show up. Some of them do, but the video only selects around 10 or 15 photos, and it’s going to choose the ones I sent to everyone! So I get the following (mixed in with one or two vacation photos, etc.) set to dramatic music:


The Q is mightier than the Sword


This feels like subliminal messaging from Apple at last year’s WWDC Bash


Andrew Shaw doesn’t give a fuck


When is this picture not relevant?


I still love the original picture of Bill on the tracks.


Yes you can Dustin Brown, yes you can.


Does anyone want to buy my coffee table?


One picture from my wife’s med school graduation, but plenty of craigslist photos, memes and other crap.

Based on how it selected these photos, I’m fairly confident Apple is relying on a metric weighted by number of shares per photo. Especially for the generated-movies feature, as that’s the one you’re most likely to simply hit play on and watch with people, instead of going through and making sure the pictures are in fact what you want. They must also have been tracking shares for a while to prepare for this feature, as most of these are older pictures.
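As a toy model of that suspected metric (entirely my guess at the idea, not Apple’s actual code; the `Photo` type and counts are made up for illustration):

```swift
// Toy model: rank photos purely by how often they've been shared,
// then take the top few for the generated movie.
struct Photo {
    let name: String
    let shareCount: Int
}

func bestOf(_ photos: [Photo], count: Int) -> [String] {
    return photos
        .sorted { $0.shareCount > $1.shareCount }  // most-shared first
        .prefix(count)                             // keep only the top few
        .map { $0.name }
}
```

Under a metric like this, the meme I AirDropped and iMessaged to everyone beats the vacation photo I shared once, which is exactly the behavior I’m seeing.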

Also, by preparing this blog post I’m pretty sure I’ve weighted those metrics even further by AirDropping all these photos to my laptop :).

Categories: Uncategorized

I have no idea what iCloud Photo Library is Doing

July 3rd, 2015 No comments

This post is my current stream-of-consciousness on iCloud Photo Library as part of the new Photos app in OS X Yosemite. Please excuse grammar/spelling errors. May contain cursing and words not appropriate for those who don’t like bad words.


I have no FUCKING idea what in the hell is going on with my iCloud Photo Library, and I hate it.

A few days ago I expanded my iCloud storage from the measly 20GB I had been paying for (at a wonderful DISCOUNTED rate of $10.99 a year, instead of the current rate of $2/month… magical) to 200GB. It was completely full of my device backups. Could I have cleared some? Sure, but the iOS 9 beta seemed to turn all my devices into ‘new backups’, I was feeling lazy, and I didn’t want my email to run out of space (yay iTools/MobileMe/iCloud email and your backup data sharing the same storage allotment). So I got 200GB. I figured, since I have this space, I may as well use this iCloud Photo Library thing everyone has been talking about. I have a decent number of photos from over the years, and I’d like to have them backed up somewhere. I pay for Crashplan, so the photos are technically backed up that way, but iCloud promised to be much more accessible. Yay!

So I enabled it, and it seemed to back up the pictures in my existing iPhoto library, which are mostly images from my iCloud Photo Stream anyway. This worked fine, I think, as far as I can tell. Then I got to the fun part!

Over the years I’ve had a variety of computers, and I’ve backed up those computers in different ways: from manually dragging and dropping user folders onto external hard drives, to Time Machine, to now Crashplan, etc. I’ve also had a variety of iPhoto, Lightroom, and Aperture libraries over the years, from my phone photos to ones taken with different Canon DSLRs. Safe to say it’s a bit of a mess. My plan was to simply grab the folder of originals from each and import them into the new iCloud Photo Library. I didn’t care about my metadata, I didn’t want the images moved, and I figured this would be the easiest process. I thought I would be able to remove any duplicates based on EXIF data.

So I did these transfers from the 5 or 6 different folders on my Mac Mini and let them finish. I did a couple of other folders from my MacBook Pro today. I was warned about duplicates, as expected, to which I answered “Do Not Import” and “Apply to All”. The results:

  • I have some photos repeated 4 times, some twice.
  • I have some photos missing.
  • I can’t search for some photos by date, but they appear when I scroll around and find them.
  • I have no idea what photos are in my library.

I still have some collections of RAW files from Lightroom that I’m holding off on until I can figure out what’s going on. As far as I can see, Photos has no way to detect duplicates after import. I don’t know why search won’t work, so I can’t easily establish which photos are missing. And on top of that, any operation like an import, or the upload running in the background, totally thrashes both of my computers and makes them almost unusable.
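For what it’s worth, the duplicate detection I was hoping for doesn’t seem hard, at least for byte-identical copies. A minimal sketch of the idea (entirely my own, nothing to do with how Photos actually works): keep the first path seen for each unique file’s contents.

```swift
import Foundation

// Sketch: given (path, file contents) pairs, keep only the first path for
// each distinct set of bytes. Data's Hashable conformance stands in for
// hashing file contents; real EXIF-based matching would need ImageIO.
func uniquePhotos(_ files: [(path: String, data: Data)]) -> [String] {
    var seen = Set<Data>()
    var keep: [String] = []
    for file in files where !seen.contains(file.data) {
        seen.insert(file.data)
        keep.append(file.path)
    }
    return keep
}
```

This only catches exact copies; resized or re-encoded duplicates would need fuzzier matching on EXIF capture dates or perceptual hashes.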


Ugh do I have to use Google Photos?

Categories: Rants, Thoughts

iOS Pasteboard Security

September 25th, 2014 No comments

There’s been a lot of chatter on the internet lately about the security of webviews embedded in 3rd party applications. Basically, the host application has full access to what you type into that webview, so if you navigate to another website and enter a password, that password could be read.

I think an interesting factor in iOS 8 that enhances security in a more subtle manner is action extensions. Action extensions allow apps like 1Password/LastPass etc. to perform small actions from within other apps. In particular, for password managers: inserting password data into Safari (and other apps that allow it). This lets people use a variety of passwords and still easily access them for authentication, which is great for security on its own, but there’s more to it than that.

The previous strategy these apps used was: you would enter the application, choose the password you wanted, and copy it to the pasteboard. You could then paste it into whatever application needed that password. This data has to be on the general pasteboard to be shared between apps, and most people simply paste the password and forget what’s on their pasteboard. This also means every app they open afterwards has access to the plaintext version of this password (along with a nice shiny com.agilebits.onepassword entry identifying where it came from). At least as far as I can tell.

I start out by going to 1Password and copying a password.

I next compile and run my app, with a simple println of the general UIPasteboard’s items in my AppDelegate, and my password is revealed. The code looks like this (in Swift for …fun?):

let pasteBoard = UIPasteboard.generalPasteboard()
println(pasteBoard.items)

The result:
[{
    "public.utf8-plain-text" = <My_Password_Here_In_Plaintext>;
}, {
    "com.agilebits.onepassword" = <Random_Numbers_Here>;
}]

Apparently you have to use the public.utf8-plain-text UTI for your pasteboard data if you want it accessible in Notes/Mail etc, according to Erica Sadun.

Doing some basic filtering on that data to exclude obviously non-password strings (too long, URLs, etc.), you could come up with some decent password candidates to try later.
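A rough sketch of that filtering, with the thresholds entirely made up:

```swift
import Foundation

// Sketch: given strings harvested from the general pasteboard, keep only
// plausible password candidates. The length bounds are arbitrary guesses.
func passwordCandidates(from strings: [String]) -> [String] {
    return strings.filter { s in
        s.count >= 6 && s.count <= 64            // plausible password length
            && !s.contains(" ")                   // passwords rarely have spaces
            && !s.lowercased().hasPrefix("http")  // skip copied URLs
    }
}
```

A background app polling the pasteboard and running something like this would build up a decent candidate list over time.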

I would love to hear if my thoughts here are wrong (perhaps debugging allows for extra access? or something else along those lines).


It seems 1Password has a setting to clear your clipboard after 30 seconds to 3 minutes, with the default being never. LastPass will let you manually clear it, but doesn’t seem to offer the same auto-clearing option.

Categories: Code, iOS, Security

UIAutomation Slides

March 19th, 2014 No comments

I gave a talk about UIAutomation at CocoaHeads Pasadena this past week, on my birthday. I wasn’t feeling too well, but I think I managed to get through it ok. A few missteps in the live demo of some of the different ways to run UIA scripts, but so it goes! Anyway, here are my slides from the evening with some comments added in:

Open it in Keynote to read all the comments!

Categories: Uncategorized

Why the first episode of House of Cards Season 2 was so damn smart

February 22nd, 2014 No comments

Serious spoilers here. Seriously…

Netflix is starting to popularize the available-all-at-once seasons of TV shows with its original programming like House of Cards and Orange is the New Black. There’s an inherent issue with these shows however, where discussing them is difficult when they first come out (and are most likely to be in the popular mindset). With a traditional TV release format (one a week for 24 weeks), there’s always a new episode to watch and talk about at the “water cooler” at work. However, when shows are immediately available to consume at once, different people have different amounts of availability to binge watch a show.
You probably know this already, but it brings me to why the first episode of the second season of House of Cards is so well written.
Read more…

Categories: Thoughts, TV

Snakes and Ladders, or really just ladders for now

November 5th, 2013 No comments

Spent some time working on ladders! They’re procedurally generated from a seed, the same as the level is (and obviously need some tweaking). I’m still working on some of their physics. SpriteKit is really nice about setting the category, collision, and contact bit masks (contacts call a delegate method), so it was easy to determine when a player was touching a ladder.

self.physicsBody.categoryBitMask = featuresCategory;
self.physicsBody.collisionBitMask = 0;
self.physicsBody.contactTestBitMask = playerCategory;

I was hoping to simply set the player’s physics body’s affectedByGravity property to NO, but there’s some weirdness where pre-existing velocity doesn’t go away (so the player floats up/down when the user isn’t pressing any keys). I’ll sort that out soon, but it’s a bit more fun to explore the caves now! Video below:
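My guess at the fix is to zero out the velocity at the moment gravity is disabled, in the contact callback. A sketch of the idea (in Swift for brevity, though the game is Objective-C; `playerCategory` matches the bit-mask snippet above, and this is untested):

```swift
// SKPhysicsContactDelegate callback; fires because the ladder's
// contactTestBitMask includes playerCategory.
func didBegin(_ contact: SKPhysicsContact) {
    let bodies = [contact.bodyA, contact.bodyB]
    guard let player = bodies.first(where: { $0.categoryBitMask == playerCategory }) else {
        return
    }
    player.affectedByGravity = false
    player.velocity = .zero  // kill leftover velocity so the player stops drifting
}
```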

Ladders Video

Categories: Code, Game Development, Project Update

Gold Hunter: Part 2

October 26th, 2013 No comments

Just a quick update here. Added a player who can move around the map (sideways, and 1 block up as necessary) and shoot the ground either down-left or down-right. No fancy animations or anything yet, just playing around with it in my random cave generator. It already brought back memories of planning out how far you’d have to dig to get to another space in Lode Runner. There are no threats yet to make it interesting, though. My next step might be to add a random patrolling creature or two.

I also discovered that you can’t update the image of an SKSpriteNode in SpriteKit on the fly; you have to remove it and add a new one. I believe you can apply animations to cycle through images and such, but simply changing the image doesn’t work. You can change the texture atlas and coordinates though, I think, but I haven’t set those up yet.

First Movement Demo


Note: I’m just posting these updates on the fly. If I spend too long on them I’ll probably get bored and give up, so apologies in advance for grammatical errors!

Categories: Game Development, Games, Project Update

Gold Hunter: Part 1

October 26th, 2013 No comments

Figured I’d keep a mini blog going on a project I’m working on (because it’s worked so well before): a game inspired by Lode Runner: The Legend Returns (later, Lode Runner Online). I really liked the no-direct-combat style of Lode Runner, planning and running around a level gathering loot, and I’d like to bring a faster level of twitch gameplay to the old-school mechanics, as well as a smaller field of view (rather than exposing the entire map to the player), and maybe play around with adding lighting/physics ideas to the game. As you can probably tell, I’ve played a decent amount of the new Spelunky lately, and it’s inspired me to make a similar game.

My first step this week was to mess around with a random level generator. I’m using SpriteKit (technically a KoboldKit wrapper around it; I’m not sure what advantages KoboldKit has for me just yet).

I started with Big Bad Waffle’s cave-generation algorithm, because it required no prep work. I just implemented it in Objective-C (using NSMutableArrays, probably not the best for performance reasons, so that will probably switch to pure C later). It created some pretty good (and, importantly, reproducible) results based on a seed value. I initially drew it using SKSpriteNodes of different colors, but then figured I’d steal the ever-useful Cute sprites to make a quick map, and off it went! I did a little shift in the y direction to make the ground appear a bit lower, just for a fun visual effect.

Note: Images used are just for a more interesting rendering of the map, not a decision on style.

Map renders: map1, map2, map3


I’ve also been reading into how Spelunky does its level generation, and it’s pretty interesting. Apparently it’s a simple variety of the Drunken Walk algorithm (or something along those lines); more details are available from a guy who ported Spelunky to the web and posted his findings. It’s great for creating a guaranteed procedural path to an exit. It requires a little more setup, as you need “rooms” with exits on different sides (and a decent assortment of them) in order to get going, but it looks pretty simple and I might give it a try later!
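My rough understanding of that guaranteed-path idea, sketched with a seeded RNG so runs are reproducible (this is my own toy version, not the actual Spelunky code):

```swift
// Tiny seeded RNG (linear congruential) so the same seed gives the same path.
struct LCG: RandomNumberGenerator {
    var state: UInt64
    mutating func next() -> UInt64 {
        state = state &* 6364136223846793005 &+ 1442695040888963407
        return state
    }
}

// Walk a width x height grid of rooms: start in a random top-row room, then
// randomly step left, step right, or drop down until we fall out the bottom.
// Every room on the returned path then gets guaranteed side/bottom exits,
// so a route from start to exit always exists.
func solutionPath(width: Int, height: Int, seed: UInt64) -> [(x: Int, y: Int)] {
    var rng = LCG(state: seed)
    var x = Int.random(in: 0..<width, using: &rng)
    var y = 0
    var path = [(x: x, y: y)]
    while true {
        switch Int.random(in: 0..<4, using: &rng) {
        case 0 where x > 0:         x -= 1   // step left
        case 1 where x < width - 1: x += 1   // step right
        default:                    y += 1   // drop down a row
        }
        if y >= height { break }             // fell out the bottom: done
        path.append((x: x, y: y))
    }
    return path
}
```

The walk can revisit rooms, which is fine here; the remaining rooms off the path can be filled in with any random templates.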

Next up: Making a player able to walk around and blow up walls! Still deciding on top-down, angle, or side view for this game.

Categories: Game Development, Games, Project Update

iOS 7 and the iPad

June 19th, 2013 No comments

So, iOS 7 features a brand new “light” interface, with thin fonts to take “full advantage” of the retina displays on new iOS devices. Now, whether or not you like it or agree with the decision (personally I wasn’t a fan at the start, but I’m really starting to like iOS 7 now!), there’s one little snag incoming.

Apple hasn’t released an iPad version of the iOS 7 beta yet, and I think I might know part of the reason why. There were rumors that iOS 7 was slightly behind schedule, so obviously they focused on the iPhone to get a good announcement and initial beta going. However, I think the iPad mini is an issue, and I’m interested to see how they’ll address it. It’s the only non-retina device that supports iOS 7, and since it came out so recently it has to be supported for at least a couple versions. I’m assuming a retina iPad mini will be coming out soon, but even so, how will iOS 7 work on the current mini? I just got mine and don’t want to give it up anytime soon (it’s not so cheap to easily replace 🙁 ). Will the text be legible? I guess they’ll probably just make the font a bit thicker, but I’m curious whether they’ll compromise on their vision of what iOS 7 should look like.

Just a random stream of consciousness post.


@joshhinman brings up a good point that the iPad 2 is also supported. That’s even less pixel dense than the iPad mini, so we’ll see how iOS 7 works on those devices. In any case, I think that Apple can drop the iPad 2 in iOS 8 without too much uproar while the mini should stick around a bit longer. I still wonder what they have planned for non-retina screens.

Categories: General, iOS, Thoughts

App.net needs something different

April 20th, 2013 No comments

I’ve been playing around with App.net, and I’m not sure how I feel about it just yet. It feels like a twitter clone, and every app is basically a twitter clone. Someone needs to make something cool with App.net, a “killer app” as the tech news people like to say, to make people realize what it’s for. What that is, I can’t claim to know. People will only pay $100 a year for a little while with zero return. There needs to be something to draw people to the platform, something so cool that people will run out of invites. And then those people will discover and use the other products.


Is this obvious? Probably.

Categories: Thoughts