This is just to say
I have fixed
the bugs
in the app
and which
you coded
and probably thought
were features
Forgive me
they were egregious
so basic
and so avoidable
In case you were wondering how long you could keep on using macOS 10.14 “Mojave” as a developer targeting any Apple OS, the answer is “not too much longer.” I was presented with the dialog box above when trying to run the beta for Xcode 11.4 on my MacBook running Mojave.
I was doing all this as part of updating The iOS Apprentice, 8th Edition, a great book for people who want to get started building iOS apps. It’s available in both electronic and dead-tree formats, and when you buy an edition, you get updates of that edition for free!
This is old news to iOS/macOS/iPadOS/watchOS developers, but it’s worth repeating. That’s all right; I’d rather code in Swift than Objective-C.
Given the fight between Google and Oracle, I’m certain that if Google were the mother in the comic, Java would be Objective-C and Kotlin would be Swift.
Last night, Anitra and I gave Tampa Bay UX Group’s first presentation of 2020: An overview of the accessibility features in iOS 13, the latest version of Apple’s mobile operating system.
A good crowd, including a handful of people new to the Tampa Bay area, was in attendance at the event, which took place at Kforce, which has a very nice meetup space. I’ll have to talk to them about using their space for Tampa iOS Meetup:
Anitra and I tag-teamed the presentation: she spoke from the UX/UI specialist’s point of view, while I covered the programmer/implementer angle:
Here are the slides from our presentation:
We started with a couple of definitions of accessibility:
We then provided a set of personas, around which we based the demos:
Our first demo was of VoiceOver, the gesture-based screen reader. We demonstrated its ability not only to read on-screen text, but also to facilitate navigation for people with low or no vision, and even to describe images when no “alt text” is provided. If you’re curious about using VoiceOver, check out this quick video guide:
Our second demo was of Voice Control, the new voice command system, which is separate from Siri. It offers an impressive amount of control over your device using only your voice; I was even able to demonstrate playing Wine Crush, a Candy Crush-style game that I wrote for Aspirations Winery, using only my voice. To find out more about Voice Control, see this promotional video from Apple:
We also wanted to show that accessibility can be aided using iOS features that weren’t specifically made for that purpose. We demonstrated this with an app that allows users to click on buttons using a head-tracking user interface based on the face-tracking capability built into Apple’s augmented reality framework:
Whoa @AccordionGuy is clicking buttons with his 👀 ! How cool is that?!?! #a11y #ux #tbux pic.twitter.com/nUEZSyrf8o
— Tampa Bay UX (@TampaBayUX) January 31, 2020
I’ll post a video of this demo in action soon, but if you’d like to try it out for yourself, you can find it on GitHub: it’s the HeadGazeLib project.
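The core idea behind head-gaze pointing is simple: ARKit’s `ARFaceAnchor` exposes a gaze estimate (its `lookAtPoint` property), and a library like HeadGazeLib maps that estimate onto screen coordinates so it can drive a cursor. Here’s a minimal sketch of that mapping step; the linear mapping and the names below are illustrative stand-ins, not HeadGazeLib’s actual math:

```swift
import Foundation

// Illustrative screen dimensions; a real app would read these from UIScreen.
struct ScreenSize {
    let width: CGFloat
    let height: CGFloat
}

// Map a gaze offset in the range -1...1 on each axis (left/down to right/up)
// to a point on the screen, clamped to the screen's bounds.
// Screen coordinates grow downward, so the y axis is flipped.
func gazeToScreenPoint(gazeX: CGFloat, gazeY: CGFloat, screen: ScreenSize) -> CGPoint {
    let x = (gazeX + 1) / 2 * screen.width
    let y = (1 - (gazeY + 1) / 2) * screen.height
    return CGPoint(
        x: min(max(x, 0), screen.width),
        y: min(max(y, 0), screen.height)
    )
}
```

With a mapping like this in place, the app just performs hit-testing at the resulting point each frame and treats a dwell (holding the gaze on a button) as a tap.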
We followed these feature demos with a couple of coding examples, where I showed how you can use SwiftUI’s accessibility features to further enhance the accessibility of your apps:
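To give a flavor of what those SwiftUI accessibility modifiers look like, here’s a minimal sketch along the lines of what we showed. The `.accessibility(label:)`, `.accessibility(value:)`, and `.accessibility(hint:)` modifiers are real SwiftUI APIs as of iOS 13, but this particular volume-control view and its helper function are made-up examples, not code from the talk:

```swift
import Foundation
#if canImport(SwiftUI)
import SwiftUI
#endif

// Hypothetical helper that builds a VoiceOver-friendly description
// of a volume level; keeping it a plain function makes it easy to test.
func volumeAccessibilityValue(_ level: Int) -> String {
    "\(level) percent"
}

#if canImport(SwiftUI)
// A made-up control demonstrating SwiftUI's iOS 13 accessibility modifiers:
// VoiceOver reads the label, the current value, and the hint.
struct VolumeControl: View {
    @State private var level = 50

    var body: some View {
        Stepper("Volume", value: $level, in: 0...100, step: 10)
            .accessibility(label: Text("Volume"))
            .accessibility(value: Text(volumeAccessibilityValue(level)))
            .accessibility(hint: Text("Adjusts the playback volume"))
    }
}
#endif
```

The nice thing about SwiftUI here is that you get a lot of accessibility “for free,” and the modifiers above let you fill in whatever VoiceOver can’t infer on its own.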
And finally, we closed the presentation with links to the following resources:
We’d like to thank Krissy Scoufis and Beth Galambos for inviting us to present at the Tampa Bay UX Group meetup. They’re a great group that promotes an important — yet often neglected — part of application development, and we’re always happy to take part in their events. We’d also like to thank everyone who attended; you were a great audience with fantastic questions and comments!
You might also want to check out the other presentations we did at Tampa Bay UX Group’s meetups:
Pictured above is Dale Mabry, a “cross the road” videogame in the style of Frogger and its later cousin, Crossy Road. It gets its name from Dale Mabry Highway, a busy north-south six-lane “stroad” in Tampa. I wrote it back in 2016 while learning iOS game programming with Swift and SpriteKit.
Here’s a sample of the gameplay:
I lifted the code for moving the player character from the raywenderlich.com book 2D Apple Games by Tutorials. It’s from the Zombie Conga game, pictured below:
I then wrote code to move the cars and handle the gameplay.
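In outline, Zombie Conga’s movement technique works like this: each frame, point a velocity vector from the sprite toward the most recent touch location, then move the sprite a little along that vector based on the elapsed time. Here’s a sketch of that idea in plain Swift; the function names and the points-per-second constant are illustrative, not verbatim from the book:

```swift
import Foundation

// How far the sprite can travel in one second, in points (illustrative value).
let movePointsPerSec: CGFloat = 480.0

// Aim a velocity vector from the sprite's position toward a target point,
// normalized so the sprite moves at a constant speed regardless of distance.
func velocity(from position: CGPoint, toward target: CGPoint) -> CGPoint {
    let offset = CGPoint(x: target.x - position.x, y: target.y - position.y)
    let length = (offset.x * offset.x + offset.y * offset.y).squareRoot()
    guard length > 0 else { return .zero }
    return CGPoint(x: offset.x / length * movePointsPerSec,
                   y: offset.y / length * movePointsPerSec)
}

// Advance a position by its velocity over one frame's elapsed time (dt),
// which in SpriteKit you'd compute inside update(_ currentTime:).
func move(_ position: CGPoint, by velocity: CGPoint, dt: TimeInterval) -> CGPoint {
    CGPoint(x: position.x + velocity.x * CGFloat(dt),
            y: position.y + velocity.y * CGFloat(dt))
}
```

The cars use the same per-frame update, just with fixed horizontal velocities instead of a touch-driven target.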
After getting the basic gameplay working, I got busy with other projects and forgot about the game for a couple of years. I recently pulled it out of mothballs just before Wednesday’s “Share Your Mobile App with Others” meetup, because organizer Edwin Torres asked attendees to show off any apps they’d worked on.
I wrote the game back in 2016, when Swift was at version 3. It took only about a half hour’s worth of work to get it running under the current versions of Swift and SpriteKit, considerably less time than I expected. I compiled it, put it on my iPad, and showed it to the group at the meetup.
Now that it’s out of mothballs, my plan is to polish it and put it in the App Store later this year; it’s going to be one of my 20 Projects in 2020.
Want to see the code for the game? You can! It’s posted on my GitHub.