Learn slowly to develop quickly.

Slow down

I’ve seen something happen many times during my longish tenure as a software developer, working alongside others who are new to it. They seem to have no desire to understand something before tearing into it.

When someone comes to mobile development, and all they have under their belt is some solid web development using JavaScript/CSS/HTML, and maybe some PHP – they are used to manipulating a page’s DOM. And that’s cool. You can do a whole host of interesting things with that understanding and experience. However, they lack many basic concepts that are essentially required for mobile development. These aren’t trivial things you can get away with not knowing – they form the basis… the solid foundation of your endeavors moving forward.

I have experienced this in the past and I see it at times at work. Instead of owning up to a lack of understanding, they want to pump out some code and show everyone they are wizards. That they’ve got it.

Only wanting to break the finish line tape…

Then they ask for some working source code without worrying about how it actually works. They try to plug it into what they’ve already cobbled together and have a hard time getting it to work. They don’t know about scope. They don’t know about classes, extensions, or how to best wire up various aspects of their UI.

They ask for completed code – but don’t typically search StackOverflow or even Google things. If they do, they don’t worry about what makes it work. In the event they do get something working, if they are asked to make tweaks or changes, they are screwed. Why? They don’t know how it works – or don’t know enough to make the change easily.

They fight with code and concepts. They want to get to the finish line in whatever they are doing – by skipping the race. In the end, they may eventually get to what they are after. But they aren’t putting the work in, the investigation, the basics of design and development, and in the end, they are only slowing down their development.

Being a developer shouldn’t be about ego. If you don’t know something, don’t be embarrassed to ask teammates. Don’t downplay your struggles.

If you don’t understand something, take the time to learn about it. Create test projects or Swift Playgrounds and bang some code. Read articles and tutorials about making things. Try them for yourself. Once you have a solid foundation, you’ll be able to more easily approach challenges and deliver projects that you understand.

Use comments in your code – and place questions, TODO or FIXME, and MARKs there. This is especially handy if you’re working on something in source control with other team members. They have a chance to see your stuff and be able to offer up help. Don’t rely on this though.

// I don't understand this, can someone elaborate?
// TODO: Why is this not working? Need some help.
// FIXME: This isn't working, how come?
// MARK: - Section Marker

When you do this, you’ll see the TODO, FIXME, and MARK entries called out in Xcode’s jump bar for the source file you’re currently working in (plain comments don’t appear there – but they’re still useful).

Explanation

Development shouldn’t be about wowing your co-workers. It’s about delivering great experiences in a timely manner, with the ability to iterate quickly along the way. Communicate your ideas and approaches. If you’re calling on others to help you bang things out – that’s good. But not in lieu of understanding what they are doing and why they are doing it.

If you are relying on others to do some pretty conceptually basic things over and over, they will become less likely to assist you in the future.

There is no escaping time and effort. There are no substitutes for trying things, learning things, and failing at them. If you learn slowly, you’ll forget slowly. Spend time outside of projects to learn core concepts. Code, code, and code some more. Look at applications you like and think about how they might be composed (backend and frontend) – and then try to build that part yourself as a prototype.

Start simple… tables, collection views, loading assets, playing sounds, using gestural input, etc. Get those things nailed down. Then subclass some UI to do something special. Learn about delegation, Classes, extensions, etc. Grow slowly. Build upon past prototypes. You’ll have something to lean on moving forward. Every day will get better and better because of your efforts and you taking the time to learn how to develop for the platform.

Ask questions.

Read.

You’ll be much happier in the end. You’ll know more. You’ll be able to produce more. You’ll be able to iterate quicker. You’ll be a better teammate.

Verifying assets with the iTunes Store… solved

Verifying

Your mileage may vary with this post, but in my case, I was chasing my tail for some time before finally coming to a solution.

At work, I generally author all kinds of prototypes. Sometimes I use Xcode and author iOS applications with interesting levels of interaction and communication. In these cases, I often need to share my work with a designer who lives several time zones away from me. Instead of him installing Xcode and building to his own devices while utilizing source control, I can swing a binary his way using TestFlight. This works pretty well. Until recently.

I would archive my project and attempt to upload it to App Store Connect so I could assign it to the designer and they would be able to install directly from TestFlight on their device.

This time the upload stuck at “Verifying assets with the iTunes Store…”

Googling for answers, I came across a lot of chatter about getting off a corporate network, because potential port blockage could keep the upload from completing. That did not work for me. I tethered to my iPhone X, tried public Wi-Fi outside my work LAN, etc. Those did not work. My upload remained stalled at the same place each and every time. I restarted my laptop. No difference.

I then did the following which actually told me what was going on:

  1. Created an archive of my project in Xcode.
  2. Exported the archive to my desktop instead of uploading it to the iTunes Store.
  3. From Xcode, launched Application Loader and selected the .ipa generated in step 2.
  4. After getting stuck during the same activity, Application Loader actually gave me a bunch of error feedback: I had forgotten to supply artwork for all of my application’s icons. Oops. Without Application Loader, I never would have known this.
  5. Assigned all the icon images in the project – then tried to upload a new archive from Xcode.
  6. It got stuck yet again. No idea why.
  7. Took that new archive and exported it to my desktop again. Used Application Loader to upload the newly created .ipa to the iTunes Store.
  8. This worked quickly and just fine!

I had become so used to authoring my own prototypes (apps) without worrying about icons for TestFlight that I simply didn’t include them. I spent a few hours trying to solve the problem. Something still seems to prevent me from uploading an archive directly through Xcode – I still needed to use Application Loader. However, I was able to get the binary uploaded, and after about 10 minutes it was processed and ready for TestFlight activity.

Apple AirPods vs Bose QuietControl 30

Airpods vs QC30

This Christmas I was gifted a nice pair of Apple AirPods. They were extremely easy to set up and use because of that Apple W1 chip they house. iOS is tailored for their integration, and they do not disappoint in regard to usability. I wish that I had some more options for left and right tap control, but I can make do with one of the options being Siri to access other functions without having to dig my phone out of my pocket. They sound decent, their charge is enough for my typical use, the case is rather nice, and while using my phone, it’s a match made in Apple-centric heaven.

I brought them to work today and paired them with my MacBook Pro. The connection was spotty. iCloud had already placed them on my Bluetooth pairing list, but under an older name I had given them. They sputtered quite a lot. I then used them with my phone again – and the experience was flawless. I went back to the MacBook Pro – spotty again. I do use a wireless keyboard and Magic Mouse with the laptop. I have no idea whether that is creating wireless traffic that impedes the smooth delivery of audio or not.

I am currently using my Bose QC30s with my laptop – always smooth, always on the money. I don’t feel them, just like I don’t feel the AirPods. Of course, they have more charge to deliver a much longer audio experience. They work fantastically with my phone too. They are not pocketable. They do connect just about as fast; I just cycle through the device list on the QC30s until I get to the laptop or phone.

A conundrum. I am going to try the AirPods again with the laptop, because the Siri control is quite nice for getting to what I want to hear without any fuss.

Update: While writing this post, I again put the AirPods in and got the chime that they are powered on. I then selected them from the Bluetooth menu on my laptop. The same chime, this time for a successful Bluetooth connection. And now Solar Fields is playing through iTunes to the AirPods. There was no sputtering at all this time. Nice. A noticeable difference in audio quality, but for now it’s alright. If someone approaches, I can simply double tap behind my ear for play/pause and have a conversation. Repeat and I’m back into the music. I could remove a bud too, but there isn’t a pressing need to do that. It’s nice as an option, more of a social cue to the person I am speaking with (it seems less rude).

My left bud has Siri; I try that and nothing happens – probably because I am not paired with my phone. My Mac has Siri, so I can only assume that might get enabled in a future OS X update. I emailed Craig about this to see if he responds in a positive way. I know I can activate Siri on OS X directly, but doing it using my AirPods would be even nicer.

I don’t have volume control either, but since I am on my Mac, that’s super-easy to do with the keyboard. 

Nits about using QC30s:

  • I can hear myself breathe at times when noise cancellation is up and the music is low or atmospheric.
  • They aren’t pocketable – I park them around my neck.
  • I have to remove a bud to speak to someone typically.
  • I still have wires, SoundSport Free would be a better comparison.

Nits about AirPods:

  • I can’t control volume directly, I use Siri to do it.
  • At times I get stuttering from my laptop – phone is fine.
  • Inferior sound to the QC30s. Not including anything about noise cancellation – here it’s good not to have it.

Typically I would say go for the QC30s, but using AirPods does have advantages, especially when you’re using your iPhone. If that is the case, and you want to use Siri, take calls, etc., there isn’t a debate – AirPods have more functionality at your disposal. They will never sound as good, but they’re a better general tool if you’re navigating the trails of an Apple-only iOS ecosystem.

Monitoring the closest 20 regions – how?

iOS allows each application to monitor up to 20 regions. That seems like a lot, especially when you figure that the regions are registered and can be triggered (entry and exit) even while your app is backgrounded or your phone is on standby. So what do you do when you have more than 20? iOS will simply ignore them if added to your locationManager.

You can monitor the 20 closest – the ones most likely to be encountered. Why bother registering for regions that are far away? So, in theory, upon a significant user location change, a region being triggered, etc., you would:

  • stop monitoring for all regions that may exist for your location manager
  • find the 20 closest regions from the user’s location
  • register for those 20

Rinse and repeat based on your own logic (as mentioned before: a significant user location update, the addition or removal of a region, etc.).

For me personally, I wanted to figure out how to do this myself. I knew I needed to perform a sort based on distances.

I had an array of CLCircularRegion instances.

var monitoredRegions:[CLCircularRegion] = []

I appended to that array when placing my pin annotations on the map. You can’t sort those by distance directly, though. Hmm. I needed something with a property I could sort on. So I created a struct with a region and a distance.

struct SortableRegion {
    var distance: Double
    var region: CLCircularRegion
}

Cool. Getting closer. We can sort on that distance property. Here is a function that basically does what I need it to do. I’ll walk through it after showing it to you.

func checkAndMonitorTwentyClosestRegions()
{
    stopMonitoringAllRegions()
        
    if monitoredRegions.count > 20
    {
        print("we need to only monitor the 20 closest regions. Call this anytime regions are added, removed, or the user moves significantly.")
            
        var sortableRegions:[SortableRegion] = []            
        let location: CLLocation = CLLocation(latitude: myMapView.userLocation.coordinate.latitude, longitude: myMapView.userLocation.coordinate.longitude)
                
        for region in self.monitoredRegions
        {
            let fenceLocation = CLLocation(latitude: region.center.latitude, longitude: region.center.longitude)
            let distance = location.distance(from: fenceLocation)
                    
            let sortRegion = SortableRegion(distance: distance, region: region)
            sortableRegions.append(sortRegion)
        }
                
        let sortedArray = sortableRegions.sorted {
            $0.distance < $1.distance
        }
                
        // Grab the first 20 closest and monitor those.
                
        let firstTwentyRegions = sortedArray[0..<20]
        for item in firstTwentyRegions {
            let region = item.region
            locationManager.startMonitoring(for: region)
        }
    } else {
        print("20 or fewer regions, monitor them all.")
        for region in monitoredRegions {
            locationManager.startMonitoring(for: region)
        }
    }
}

Right off the bat, I call a method that stops monitoring for all regions a location manager might have. Basically just:

for monitored in locationManager.monitoredRegions {
    locationManager.stopMonitoring(for: monitored)
}

Now I check how many regions there are. If there are 20 or fewer, we register them all. Easy. If there are more than 20, we need to find the nearest 20 and start monitoring those.

I created an empty array typed to SortableRegion (that struct I made). I then grabbed a reference to the user’s location, which we use to determine the distance to the center of every region. For each region, I create an instance of SortableRegion and append it to the array, so each instance now has a reference to the CLCircularRegion as well as the distance. Perfect. When we’re done, we sort the whole thing based on distance, lowest to highest.

Then we take a slice of the sorted array (using a range), grabbing just the first 20 items. Pretty nifty. We loop through that slice and, using each item’s reference to its CLCircularRegion, start monitoring each one. When you want to check again, make sure monitoredRegions is properly updated first, then call the checkAndMonitorTwentyClosestRegions function and you should be all set.
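
As for where to call this from: one reasonable spot is the location manager delegate itself. A minimal sketch, assuming this class is already the CLLocationManagerDelegate and that location updates (significant-change or standard) are running:

// Re-evaluate the closest regions whenever the user's location changes meaningfully.
func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
    // If regions were added or removed elsewhere, refresh monitoredRegions before this call.
    checkAndMonitorTwentyClosestRegions()
}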

The tvOS long press

Random

If you’d like to detect a long press on a UIButton or just for a view that has no buttons – it’s pretty easy. There is one simple gotcha, however. You’ll trigger your selector twice. Once when detected (down on the glass pad Apple TV remote), and again on the pad’s release.

So you’ll need to set a Bool flag so you know what the state is. Easy enough.

fileprivate var longDown: Bool = false

override func viewDidLoad()
{
    super.viewDidLoad()
    let longPressGestureRecognizer = UILongPressGestureRecognizer(target: self, 
    action: #selector(longPress(longPressGestureRecognizer:)))
    view.addGestureRecognizer(longPressGestureRecognizer)
}

func longPress(longPressGestureRecognizer : UILongPressGestureRecognizer)
{
    // I fire on the down and also on the release.
        
    if longDown == false {
        longDown = true
        print("long press down.")
    } else {
        longDown = false
        print("long press up.")
    }
}
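
An alternative I’ve dabbled with (not exhaustively tested): rather than keeping your own flag, you can switch on the recognizer’s state, since a long press recognizer reports .began on the press and .ended on the release. A sketch:

func longPress(longPressGestureRecognizer : UILongPressGestureRecognizer)
{
    switch longPressGestureRecognizer.state {
    case .began:
        print("long press down.")
    case .ended, .cancelled:
        print("long press up.")
    default:
        break
    }
}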

I would not have thought that the gesture would be triggered twice on hardware, but it is.


WWDC 17 Packing List

WWDC17

It’s almost time for WWDC17, this year in San Jose, CA instead of San Francisco. I haven’t been in several years because my lottery skills are obviously lacking. I managed to score a ticket this year and I’m pretty excited. What will I take?

Well, to be honest, taking a laptop to sessions in the past, for me, has been serious overkill. I might jot some notes down in a Moleskine or other notebook – for transcription later. Taking notes on a laptop brings distraction to the game for me. Am I plugged in and charging? I need to get close to a plug. Oh, I can check Facebook during this session. Email. Pretend to be coding something amazing in Xcode. Taking notes on a laptop makes me feel like a court stenographer and I want to relax a bit and take the information in. I can always watch the session again later in its stored video format (to catch some details that might have gone by too quickly).

I will still take my trusty MacBook Pro R, but that will be for when I’m back in my room at night – coding, emailing, sorting photos, watching session videos again, etc. It will probably stay in my room locked up. Charged and ready for my welcome return at the close of each day. That will save me a bunch of weight to lug around too. I can use my laptop bag for other things, including hoofing multiple Odwalla juices around (if they supply those big cans of it around the venue).

For travel:

  • Bose QuietControl 30
  • Bose QuietComfort 35
  • Casio ProTrek PRW 3100y-1B (all black) watch

For the sessions:

  • iPhone 6 Plus. The most important bit of gear I have.
  • At least two portable battery chargers: an Anker Astro Pro Series (20,000 mAh) and a RavPower (16,750 mAh) one I had before the Anker. Both charged up and ready to fly. The Anker needs to be plugged into an outlet to recharge; the RavPower recharges over micro-USB.
  • Two point-and-shoot cameras – both Sony. An older one which is my favorite, and a newer model which I don’t like as much.
  • Timbuk2 messenger laptop bag.
  • Moleskine, fountain pen, EDC pens, etc.
  • Zojirushi Coffee Thermos 20oz. Über awesome and über important.
  • Various cables, USB wall plugs, etc.
  • Takeya ThermoFlask Insulated Stainless Steel Water Bottle, 40 oz, Asphalt. Huge. Perfect. Might leave at the hotel as it could be overkill.
  • Apple Watch series 2. I’m hoping Apple updates the WWDC iOS app and it gets watch support – for keeping us on our selected schedules, getting important updates, map support, etc.
  • Sunglasses. For me, very important.

Hotel:

  • Takeya ThermoFlask Insulated Stainless Steel Water Bottle, 40 oz, Asphalt. Huge. Perfect. No water bottles.
  • Couple of polos/shirts (a few older WWDC ones, a Swift logo one, etc.)
  • Shorts/jeans/Scarpa shoes. Scarpa Margherita rock.
  • A spring jacket in case it gets cool at night (Patagonia fold up for laptop bag) – not wearing a WWDC17 one if Apple swags those out.
  • Livionex tooth gel. It’s amazing.
  • Wet shaving and beard gear. Very important to clean up daily.
  • Packing cubes. Although I’m not bringing a ton of clothes, they really, really, really help keep things tidy. I know I’ll be bringing back a few additional shirts and a sweatshirt at least.
  • Snooz white noise generator (I got from Kickstarter)

Getting hour offsets for local time versus a target time zone in Swift

Swift

I am currently working on something that will display time in different time zones. To be accurate, though, you need to compare the user’s current time against the time in each target time zone. This is what I figured out.

It’s easy to get the user’s local time.

let cal = Calendar(identifier: .gregorian)
let date = Date()
let hour = cal.component(.hour, from: date)
let minute = cal.component(.minute, from: date)
let second = cal.component(.second, from: date)
print("\nLOCAL TIME: \(hour):\(minute):\(second)\n")

Now, you need to get the time in other time zone targets. Here is an example for London, England. A continuation from the code above.

let london = TimeZone(identifier: "Europe/London")
let comp = cal.dateComponents(in: london!, from: date)
print("LONDON: \(comp.hour!):\(comp.minute!):\(comp.second!), day: \(comp.day!)")

Excellent. Now, compute the difference between the user’s local time and the time in London.

// I am in Boston currently.
print("London offset (modifier): \(comp.hour! - hour)") // 5

And there you have it. London is 5 hours ahead (the value is positive). If I were in London or that time zone, the difference would be 0. For someone like me who rarely dabbles in date work, this was a bit of discovered magic. For others, this post must be quite pedestrian and a waste of time. I apologize. But if I had found a post like this earlier, I could have saved myself some time.
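
One caveat with subtracting hour components: around midnight the hours wrap (23:00 local versus 04:00 the next day in London would give you -19). If all you need is the offset between the two zones, TimeZone can hand it to you directly. A small sketch, continuing from the code above:

// Offset from GMT (in seconds) for each zone at this moment, then the difference in hours.
let londonOffset = TimeZone(identifier: "Europe/London")!.secondsFromGMT(for: date)
let localOffset = TimeZone.current.secondsFromGMT(for: date)
let hourDifference = (londonOffset - localOffset) / 3600
print("London offset (modifier): \(hourDifference)") // 5 from Boston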


tvOS UIFocusGuide demystified

A post about tvOS UIFocus enlightenment and a helper Class that I use to help debug my user interface work.

UIFocus

Above you’ll notice four buttons (a screenshot from my actual Apple TV). You’ll also notice a total of eight purple boxes with dashed outlines. Each is labeled in all capitals. Those are instances of my helper Class (“FocusGuideRepresentation”). You see, UIFocusGuides do not have any visual interface. So when you are deploying them, you’re essentially working in the dark. This helper Class shows you right where the guides are to help you visually lay everything out. Of course, when you get into dynamic situations where buttons are shuffling around, this can help you even more.

The focus management for tvOS works incredibly well when your buttons line up vertically or horizontally. It just works, you don’t need to do anything for that functionality. When they aren’t aligned, you can get into situations where buttons aren’t available through normal navigation. Above, without any UIFocusGuides, a user could move from button One to Two. And back up. That would be it. Buttons Three and Four would be hung out to dry without the user able to navigate to them. That’s why UIFocusGuides exist.

You can think of them as invisible buttons that pass focus to an actual button – based on rules that you provide.

I decided that in addition to being able to move up and down with the Siri remote to access the buttons, left and right should also work at the top. A user swiping right from One to get to Three should work. That means 8 guides, as you can see in the diagram. The dashed rule lines show how each guide passes focus. That is a lot of guides, but in the end, the navigation ends up being buttery and simple to use. An application should be a joy to use.

Below is the code for the helper Class, followed by the full code for what you see in the image. Try it out on an Apple TV and see what’s going on and experience how nice it feels getting around.

import UIKit

class FocusGuideRepresentation: UIView {

    init(frameSize: CGRect, label: String)
    {
        super.init(frame: frameSize)
        self.backgroundColor = UIColor.blue.withAlphaComponent(0.1)
        let myLabel = UILabel(frame: CGRect(x: 0, y: 0, width: self.frame.width, height: self.frame.height))
        myLabel.font = UIFont.systemFont(ofSize: 20)
        myLabel.textColor = UIColor.white.withAlphaComponent(0.5)
        myLabel.textAlignment = .center
        myLabel.text = label.uppercased()
        self.addSubview(myLabel)
        
        // Add a dashed rule around myself.
        
        let border = CAShapeLayer()
        border.strokeColor = UIColor.white.withAlphaComponent(0.4).cgColor
        border.fillColor = nil
        border.lineWidth = 1
        border.lineDashPattern = [4, 4]
        border.path = UIBezierPath(rect: self.bounds).cgPath
        border.frame = self.bounds
        self.layer.addSublayer(border)
    }
    
    required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}

And, now, the main code for the ViewController. Note: the 4 UIButtons are in the Storyboard and hooked up (as you see the @IBOutlets).

import UIKit

class ViewController: UIViewController {

    @IBOutlet var one:   UIButton!
    @IBOutlet var two:   UIButton!
    @IBOutlet var three: UIButton!
    @IBOutlet var four:  UIButton!

    var fg1: FocusGuideRepresentation!
    var fg2: FocusGuideRepresentation!
    var fg3: FocusGuideRepresentation!
    var fg4: FocusGuideRepresentation!
    var fg5: FocusGuideRepresentation!
    var fg6: FocusGuideRepresentation!
    var fg7: FocusGuideRepresentation!
    var fg8: FocusGuideRepresentation!
    
    override func viewDidLoad() {
        super.viewDidLoad()
        setUpFocusGuides()
        
        // Pop these back up above the guide representations.
        
        self.view.addSubview(one)
        self.view.addSubview(two)
        self.view.addSubview(three)
        self.view.addSubview(four)
    }

    func setUpFocusGuides()
    {
        let firstFocusGuide = UIFocusGuide()
        view.addLayoutGuide(firstFocusGuide)
        firstFocusGuide.leftAnchor.constraint(equalTo:   one.leftAnchor).isActive =    true
        firstFocusGuide.topAnchor.constraint(equalTo:    one.bottomAnchor).isActive =  true
        firstFocusGuide.heightAnchor.constraint(equalTo: one.heightAnchor).isActive =  true
        firstFocusGuide.widthAnchor.constraint(equalTo:  three.widthAnchor).isActive = true
        firstFocusGuide.preferredFocusEnvironments = [three]
        
        let secondFocusGuide = UIFocusGuide()
        view.addLayoutGuide(secondFocusGuide)
        secondFocusGuide.rightAnchor.constraint(equalTo:  three.rightAnchor).isActive =  true
        secondFocusGuide.bottomAnchor.constraint(equalTo: three.topAnchor).isActive =    true
        secondFocusGuide.heightAnchor.constraint(equalTo: three.heightAnchor).isActive = true
        secondFocusGuide.widthAnchor.constraint(equalTo:  three.widthAnchor).isActive =  true
        secondFocusGuide.preferredFocusEnvironments = [one]
        
        let thirdFocusGuide = UIFocusGuide()
        view.addLayoutGuide(thirdFocusGuide)
        thirdFocusGuide.leftAnchor.constraint(equalTo:   two.leftAnchor).isActive =   true
        thirdFocusGuide.bottomAnchor.constraint(equalTo: two.topAnchor).isActive =    true
        thirdFocusGuide.heightAnchor.constraint(equalTo: two.heightAnchor).isActive = true
        thirdFocusGuide.widthAnchor.constraint(equalTo:  four.widthAnchor).isActive = true
        thirdFocusGuide.preferredFocusEnvironments = [four]

        let fourthFocusGuide = UIFocusGuide()
        view.addLayoutGuide(fourthFocusGuide)
        fourthFocusGuide.leftAnchor.constraint(equalTo:   four.leftAnchor).isActive =   true
        fourthFocusGuide.topAnchor.constraint(equalTo:    four.bottomAnchor).isActive = true
        //fourthFocusGuide.bottomAnchor.constraint(equalTo: two.bottomAnchor).isActive =  true
        fourthFocusGuide.heightAnchor.constraint(equalTo: four.heightAnchor).isActive = true
        fourthFocusGuide.widthAnchor.constraint(equalTo:  four.widthAnchor).isActive =  true
        fourthFocusGuide.preferredFocusEnvironments = [two]
        
        let fifthFocusGuide = UIFocusGuide()
        view.addLayoutGuide(fifthFocusGuide)
        fifthFocusGuide.leftAnchor.constraint(equalTo:   three.leftAnchor).isActive =   true
        fifthFocusGuide.bottomAnchor.constraint(equalTo: one.bottomAnchor).isActive =  true
        fifthFocusGuide.heightAnchor.constraint(equalTo: three.heightAnchor).isActive = true
        fifthFocusGuide.widthAnchor.constraint(equalTo:  three.widthAnchor).isActive =  true
        fifthFocusGuide.preferredFocusEnvironments = [three]
        
        let sixthFocusGuide = UIFocusGuide()
        view.addLayoutGuide(sixthFocusGuide)
        sixthFocusGuide.leftAnchor.constraint(equalTo:   one.leftAnchor).isActive =   true
        sixthFocusGuide.bottomAnchor.constraint(equalTo: three.bottomAnchor).isActive =  true
        sixthFocusGuide.heightAnchor.constraint(equalTo: three.heightAnchor).isActive = true
        sixthFocusGuide.widthAnchor.constraint(equalTo:  three.widthAnchor).isActive =  true
        sixthFocusGuide.preferredFocusEnvironments = [one]
        
        let seventhFocusGuide = UIFocusGuide()
        view.addLayoutGuide(seventhFocusGuide)
        seventhFocusGuide.leftAnchor.constraint(equalTo:   four.leftAnchor).isActive =   true
        seventhFocusGuide.bottomAnchor.constraint(equalTo: two.bottomAnchor).isActive =  true
        seventhFocusGuide.heightAnchor.constraint(equalTo: four.heightAnchor).isActive = true
        seventhFocusGuide.widthAnchor.constraint(equalTo:  four.widthAnchor).isActive =  true
        seventhFocusGuide.preferredFocusEnvironments = [four]
        
        let eighthFocusGuide = UIFocusGuide()
        view.addLayoutGuide(eighthFocusGuide)
        eighthFocusGuide.leftAnchor.constraint(equalTo:   two.leftAnchor).isActive =   true
        eighthFocusGuide.bottomAnchor.constraint(equalTo: four.bottomAnchor).isActive =  true
        eighthFocusGuide.heightAnchor.constraint(equalTo: four.heightAnchor).isActive = true
        eighthFocusGuide.widthAnchor.constraint(equalTo:  four.widthAnchor).isActive =  true
        eighthFocusGuide.preferredFocusEnvironments = [two]
        
        // To aid in debug placement.
        
        fg1 = FocusGuideRepresentation(frameSize: firstFocusGuide.layoutFrame, label:  "first")
        fg2 = FocusGuideRepresentation(frameSize: secondFocusGuide.layoutFrame, label: "second")
        fg3 = FocusGuideRepresentation(frameSize: thirdFocusGuide.layoutFrame, label:  "third")
        fg4 = FocusGuideRepresentation(frameSize: fourthFocusGuide.layoutFrame, label: "fourth")
        fg5 = FocusGuideRepresentation(frameSize: fifthFocusGuide.layoutFrame, label: "fifth")
        fg6 = FocusGuideRepresentation(frameSize: sixthFocusGuide.layoutFrame, label: "sixth")
        fg7 = FocusGuideRepresentation(frameSize: seventhFocusGuide.layoutFrame, label: "seventh")
        fg8 = FocusGuideRepresentation(frameSize: eighthFocusGuide.layoutFrame, label: "eighth")
        
        self.view.addSubview(fg1)
        self.view.addSubview(fg2)
        self.view.addSubview(fg3)
        self.view.addSubview(fg4)
        self.view.addSubview(fg5)
        self.view.addSubview(fg6)
        self.view.addSubview(fg7)
        self.view.addSubview(fg8)
    }
    
    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }
}

 


Creating a tvOS parallax UIButton

Image Stack

If you’ve been involved in tvOS application development, or you’re new to the whole process, you might find this post interesting in regards to user interface.

I have recently been involved in tvOS dabbling. I’ve been creating a custom application (no TVML) and I wanted one of those nifty parallax-style buttons that Apple uses in its media browsing (notably the TV application). You can focus an item and it will shift neatly as you move around on the Siri remote. Perfect if you don’t need to include explanatory text. However, creating one eluded me for a short while. There may be a better way to handle this, but I present to you my working solution.

TV Stack

Create an Apple TV Image Stack

You need to create a new Apple TV Image Stack in Xcode (when you have your Assets.xcassets folder selected).

The stack takes 3 files by default. I used a background white PNG, a shadow PNG, and a product PNG. If you don’t supply a solid background image, you’ll see the shadow behind your UIButton – which doesn’t look good. Drag your images into each layer (Front, Middle, Back). You’ll get a nice preview of the parallax at the top so you can see how your images work together. This allows you to determine whether you need to make changes to get the look that you want. I’ve shown each image at the top of this post, each box representing one of them. I named mine “qc35Stack” – which you’ll need to refer to later as an image name.

stacks

Create your custom UIButton with parallax

Now that you have your image stack, you can use it to supply images for a UIButton’s various states. When you supply a stack in this way, the button will know that it should perform a parallax presentation. I created the 3 images at the same size that I want the buttons to be (i.e. the image sizes match the UIButton’s declared frame).

Here is the code I settled on.

// The same image stack supplies every state; static PNGs for the normal and
// highlighted states made the transitions glitchy (see the note below).
let buttonUnpressedTexture = UIImage(named: "qc35Stack")
let buttonPressedTexture = UIImage(named: "qc35Stack")
let newButton = UIButton(type: .custom)
newButton.frame = CGRect(x: 800, y: 200, width: 180, height: 180)
newButton.setImage(buttonUnpressedTexture, for: UIControlState.normal)
newButton.setImage(buttonPressedTexture, for: UIControlState.highlighted)
newButton.setImage(UIImage(named: "qc35Stack"), for: UIControlState.focused)
newButton.imageView?.clipsToBounds = false
// These two lines give the focused button its parallax/tilt treatment.
newButton.imageView?.adjustsImageWhenAncestorFocused = true
newButton.adjustsImageWhenHighlighted = true
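
One thing the snippet leaves out: the button still has to be added to the view hierarchy, and you’ll probably want an action wired up. A quick sketch (buttonTapped is a hypothetical selector of your own):

view.addSubview(newButton)
// .primaryActionTriggered is the event fired by a click on the Siri remote touchpad.
newButton.addTarget(self, action: #selector(buttonTapped), for: .primaryActionTriggered)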

Tada

Finished.

A note: I first attempted to use static PNG images for the highlighted and normal states of the UIButton, but the transitions between them and the focused state were glitchy and didn’t look good. When I used the same stack for each of those states, things looked good. I don’t know if this is the correct way to do it or not, but it is working.

Go ahead and try it out for yourself. It seems to work a treat, and you didn’t need to subclass a UIButton in your own Class to get it working either.

A short post because the solution is pretty simple.

Richer text for UILabels

Swift

This post is silly simple, but in the past, I remember doing things like this using Ranges. You have a 2-line UILabel and you want a bold font for the first line, and then regular for the second. In Swift, this is quite simple and straightforward. This is all the code you’d need to pull it off easily.

let style = NSMutableParagraphStyle()
style.lineSpacing = 5
let attString = NSMutableAttributedString(string: "I am the first line.\n",
    attributes: [NSFontAttributeName: UIFont(name:"Gotham-Bold", size:18.0)!,
    NSParagraphStyleAttributeName: style])
attString.append(NSMutableAttributedString(string: "I am the second line.",
    attributes: [NSFontAttributeName: UIFont(name:"Gotham-Book", size:18.0)!,
    NSParagraphStyleAttributeName: style]))
label.attributedText = attString

 

Returning data from an async network operation in Swift

Swift is cool

If you’re ever using asynchronous network operations (say GET or POST) and want to return data when calling a method, you’ll quickly understand that it’s not so easy. But you’ll see below how you can do this fairly easily.

Let’s say you have a Class that handles all kinds of network communication. I call mine an “adapter” for lack of a better name. I can call methods on the Class and get data returned from it. You’re never going to be sure when the data returned is available, so you need to set things up with a block. Here is an example method in an adapter Class that I have. I included the Struct so you know how it’s set up. I’m also using AEXML to parse the returned XML to make things easier for me as a developer.

struct SpeakerInformation {
    var deviceID: String
    var name: String
    var type: String
}

func getSpeakerInformation(callback:@escaping (SpeakerInformation) -> () )
{
    var request = URLRequest(url: URL(string: "http://\(ipAddress):\(port)/info")!)
    request.httpMethod = "GET"
    let session = URLSession.shared
    session.dataTask(with: request) { data, response, err in
            
        var options = AEXMLOptions()
        options.parserSettings.shouldProcessNamespaces = false
        options.parserSettings.shouldReportNamespacePrefixes = false
        options.parserSettings.shouldResolveExternalEntities = false
                        
        do {
            let xmlDoc = try AEXMLDocument(xml: data!, options: options)
                
            if let name = xmlDoc.root["name"].value {
                    
                let thisName = name
                let thisDeviceID = xmlDoc.root.attributes["deviceID"]
                let thisType = xmlDoc.root["type"].value!
                let thisInfoPacket = SpeakerInformation(deviceID: thisDeviceID!, name: thisName, type: thisType)
                callback(thisInfoPacket)
            }
        } catch {
            print(error)
        }
        }.resume()
}

See that callback with the typed object? This is how you access the data from this sample GET call. I have the block wrapped up in a method.

func whatAreMyDetails() 
{
    // The response object in the block.
    stAdapter.getSpeakerInformation() { (response) in
        // The callback hands back a non-optional SpeakerInformation, so no cast is needed.
        DispatchQueue.main.async {
            self.headerLabel.text = response.name
            self.deviceTypeLabel.text = response.type.uppercased()
        }
    }
}

So you can’t set something up which would allow for this kind of syntax:

let foo = stAdapter.getSpeakerInformation()

I hadn’t done a lot of networking calls in quite some time, and I stumbled a bit trying to force the direct approach before I thought about it some more and did some research. You can return an object, you can return an array of objects, a dictionary, all kinds of stuff. Super handy, and the block keeps things nice and tidy.
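
To make that last point concrete: the only thing that really changes for other shapes of data is the callback’s type. A sketch with hypothetical methods on the same adapter (the bodies would follow the exact dataTask pattern shown above):

// Hypothetical variations – only the callback signature differs.
func getSpeakerNames(callback: @escaping ([String]) -> ()) {
    // ...same URLSession/AEXML work as getSpeakerInformation, then:
    // callback(names)
}

func getSpeakerAttributes(callback: @escaping ([String: String]) -> ()) {
    // ...parse, then:
    // callback(attributes)
}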


UILabel centered text – getting the text’s rect

Swifty

I recently had a section of user interface where I had a UILabel that took up an iPhone’s full width. It contained text that would update over time. I wanted to place an image icon beside the first character of the label, but I needed to know where to place it. I had done it in the past but honestly forgot exactly how to do it.

Kyle Sluder from the Apple Cocoa-Dev list reminded me what I should be looking for.

UILabel.textRect(forBounds:limitedToNumberOfLines:)

Doh! I should have decided to RTFM — read the documentation. You can get a rect for the label, but you can’t use all of it for positioning. Here is sample code in how to get the job done.

let label = UILabel(frame: CGRect(x: 0, y: 100, width: self.view.frame.size.width, height: 30))
label.textAlignment = .center
label.font = UIFont.systemFont(ofSize: 16)
label.numberOfLines = 1
label.text = "This show is crazy."
        
let rect:CGRect = label.textRect(forBounds: label.bounds, limitedToNumberOfLines: 1)
let v = UIView(frame: CGRect(x: rect.origin.x, y: label.frame.origin.y, width: rect.width, height: label.frame.height))
v.backgroundColor = UIColor.red.withAlphaComponent(0.3)
        
self.view.addSubview(label)
self.view.addSubview(v)

Pretty easy. I just forgot how to do it. This is for the whole text, not substrings within it.
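
And for the original goal – the icon beside the first character – the rect’s x origin is the piece you need. A sketch, assuming a hypothetical “icon” image asset and a 20-point size:

// Place a small icon just to the left of where the centered text begins.
let iconSize: CGFloat = 20
let icon = UIImageView(image: UIImage(named: "icon")) // "icon" is a placeholder asset name
icon.frame = CGRect(x: rect.origin.x - iconSize - 4,
                    y: label.frame.origin.y + (label.frame.height - iconSize) / 2,
                    width: iconSize,
                    height: iconSize)
self.view.addSubview(icon)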

 

Getting the rotation angle after CABasicAnimation?

Spinner

I had a view that I rotate a lot, often more than 360 degrees (it spins around a few times). Each time it stops, I want to determine the resulting “visual” angle. How does one go about doing that?

rotateView is a configured CABasicAnimation:
let rotateView = CABasicAnimation()
let randomAngle = arc4random_uniform(361) + 1440
rotateView.fromValue = 0
rotateView.toValue = Float(randomAngle) * Float(M_PI) / Float(randomBetweenNumbers(firstNum: 90.0, secondNum: 180.0))
let randomSpeed = randomBetweenNumbers(firstNum: 1.5, secondNum: 2.1)
rotateView.duration = CFTimeInterval(randomSpeed)
rotateView.repeatCount = 0
rotateView.isRemovedOnCompletion = false
rotateView.fillMode = kCAFillModeForwards
rotateView.timingFunction = CAMediaTimingFunction(name: kCAMediaTimingFunctionEaseOut)
innerRing.layer.add(rotateView, forKey: "transform.rotation.z")

Once done, I dispatch after the randomSpeed duration so I know when the animation has completed. I then wanted the resulting angle. I searched all over the place and found a lot of old, broken things and things that didn’t work for me.
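
For reference, the “dispatch after” bit is nothing fancier than this sketch (randomSpeed is the duration used when configuring the animation above):

// Read the presentation layer once the animation has had time to finish.
DispatchQueue.main.asyncAfter(deadline: .now() + Double(randomSpeed)) {
    // the angle-reading code below goes here
}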

This, however, does work (I spent an hour at the late night kitchen table trying things out)…

let transform:CATransform3D = innerRing.layer.presentation()!.transform
let angle: CGFloat = atan2(transform.m12, transform.m11)
var testAngle = radiansToDegrees(radians: angle)
if testAngle < 0 {
    testAngle = 360 + testAngle
}
print("final: \(testAngle)º")

// Here is the helper function

func radiansToDegrees(radians: CGFloat) -> CGFloat {
    return radians * 180 / CGFloat(M_PI)
}

It rocks. Spin the hell out of your view and get the result using the code above.


watchOS: Adding a complication to an existing project?

Apple Watch Complications

Much of Xcode is wonderful to use in my opinion. You can dig and dig and dig and still find new things in there. However, there are aspects that induce sphincter tightening just thinking about it (without version control). Anything you do to a project file is scary stuff.

Recently I had a project that I attempted to add a watchOS complication to. You can’t undo that. I ended up with a bunch of extra targets, including a new watch extension itself. I wasn’t sure what I was doing and things started producing warnings and errors. I deleted a bunch of stuff  and cleaned the build folder. Whew.

Turns out, it’s quite easy to create a new project with complications present before anything. Most of the project’s code was able to be ported quite easily. Some other things took some time (plist files, user interfaces, etc.)

I have yet to find out how to add a complication to an existing watchOS application. I spent too much time searching the internet for answers. I was actually able to mostly figure out complication templates too – after a lot of trial and error. So much so that I had to use the simulator because I was building so much to get it working.


keep it simple. Frameworks?

keep it simple

Recently I had a project in a workspace that produced a framework of Objective-C and Swift code, supplied by someone else. I had a devil of a time getting it to build for me. And when I did, I tried to copy it into my own project to use it, only to find that the project somehow thought I already had it. So I had to link to it instead. That meant building an archive didn’t work, so I added a path for embedded frameworks, and then found out there was a problem with how the framework was built in the first place. I was jumping through hoops. I’m no Xcode guru. It’s great until it doesn’t do what you want.

I spent hours trying to get things to work. Hours. The dry-heaving Xcode project red sweats. The kind you want to roll a d20 saving throw to see if it will make it to the end of the day and live on.

I took the framework project’s code and copied it into my Swift project, added a bridging header for an Objective-C header, and I was done. About 10 minutes of work (really) that could have saved me hours. Now… if that framework changes, I need to stay on top of that. If I used a linked framework under source control, I’d be better off. But that framework won’t really change much at all.

Sometimes it really is better to keep it simple.

The horror of Xcode project errors.

The horror.

We have all been there, right? You design the UI, you’ve planned out the architecture and how your application will work. Days go by and you’ve got things nailed down tight. Interactions are smooth, data is flowing like warm nectar from some newly discovered fountain of awesomeness. Your project is ready for the next thing.

You build a framework and add it to your baby. Errors. Huh? Fucking errors.

You start to try different things to get rid of the errors. The errors morph into different errors. You start to edit bits of your project and you’re worried that too much is changing to accommodate this new chunk of code. Sweat on your furrowed brow as you JUST WANT TO BUILD the project again. You’re missing the ability to see your application behave and do what you want. Right now it’s laughing at you.

When will this stop?!

You roll back and try again. You alleviate certain errors through intuition, luck, and StackOverflow searches. You feel it slipping away from you again; you’re editing stuff all over the place. You’re getting frustrated. Oh, doh! I need to build the framework for generic iOS devices. Oh, you need to add a few keys to your Info.plist. You can’t copy the file into the project because Xcode thinks it’s already referenced – so you make a copy of the framework and move it in. Gradually it all starts to come back together. You add stuff to the AppDelegate and the compiler is complaining. You get it to link up and work. After hours of pounding agitation.

Of course, we’ve all been there. From 100 miles per hour to 2 miles per hour in a matter of milliseconds. You prayed. You researched. You tried little things to see if they make a difference, even though you knew they wouldn’t. You start to forget all of the things you’ve done to alter your project in the hopes of getting it to build again. You burn through hours. If it was late, you have dreams about what you might try next.

Magic. Thank you!

Some little blip from the darkest recesses of your mind floats up and sparks a solution. Or a path to one. And you walk it with your keyboard in the hopes of lighting the candle in the dank and dark room of despair.

And then it comes together. Somehow it just does. At the tail end of the journey it all seems so obvious and clear. You feel satisfaction in knowing you’re not totally screwed anymore.

That time was well spent in learning how part A goes into part B which depends on part C, D, and E. You shake your head and feel elation when your build completes without error (and any warnings are actually welcomed at that point… yellow is better than red).

Coffee tastes better. You go through and delete all those blocks of random code you ended up commenting out in favor of other mystical trial and error code.

You look at your screen and mutter the words.

“Yeah, well. The Dude abides.”

Sending and playing an audio file from Apple Watch Extension to iOS

Watch Image

I recently played around with recording audio on my Apple Watch. Doing that was easy enough, and I wanted to send the recorded file from the watch extension to the iOS application – and play it. I started messing around and used transferFile as the communication protocol, since it can work in the background (without the iOS app needing to be open). I banged on it a ton and it worked – but I couldn’t figure out a way to actually play it. The file.fileURL.path had some dynamic stuff in the URL which prevented me from successfully creating a valid AVAudioPlayer.

Here is my solution in sending a file (wav) to the iPhone and playing it upon receipt.

//Watch Extension code

override func awake(withContext context: Any?) {
    super.awake(withContext: context)
        
    if WCSession.isSupported() {
        WCSession.default().delegate = self
        WCSession.default().activate()
    }
        
    let fileManager = FileManager.default
    let container = fileManager.containerURL(forSecurityApplicationGroupIdentifier: "group.net.ericd.WatchRecord")
    let fileName = "audioFile.wav"
    // class variable
    saveURL = container?.appendingPathComponent(fileName) as NSURL?    
}

@IBAction func sendAudio() {
    let data = NSData(contentsOf: saveURL as! URL)
    sendAudioFile(file: data!) // Quicker.
}

func sendAudioFile(file: NSData) {
    WCSession.default().sendMessageData(file as Data, replyHandler: { (data) -> Void in
        // handle the response from the device
    }) { (error) -> Void in
        print("error: \(error.localizedDescription)")
    }
}

When presentAudioRecorderController successfully saves the recorded audio file, it enables a button that calls the sendAudio function. That code is here.

@IBAction func recordAudio() {
    let duration = TimeInterval(10)
    let recordingOptions = [WKAudioRecorderControllerOptionsMaximumDurationKey: duration]
    print("record:", saveURL as! URL)
    presentAudioRecorderController(withOutputURL: saveURL as! URL,
                                                    preset: .narrowBandSpeech,
                                                    options: recordingOptions,
                                                    completion: { saved, error in
                                                        
                                                    if let err = error {
                                                        print(err.localizedDescription)
                                                    }
                                                        
                                                    if saved {
                                                        print("saved file.")
                                                        self.playButton.setAlpha(1.0)
                                                        self.sendButton.setAlpha(1.0)
                                                        self.playButton.setEnabled(true)
                                                        self.sendButton.setEnabled(true)
                                                    }
    })
}

Now, in the iOS application, I handle the receipt of the sendMessageData method.

func session(_ session: WCSession, didReceiveMessageData messageData: Data, replyHandler: @escaping (Data) -> Void)
{
    DispatchQueue.main.async
    {
        self.someLabel.text = "We got an audio file: \(messageData)" //Show bytes
        self.versionLabel.textColor = UIColor.blue
            
        do {
            self.player = try AVAudioPlayer(data: messageData)
            guard self.player != nil else { return }
            self.player?.prepareToPlay()
            self.player?.play()
                
        } catch let error as NSError {
            print(error.localizedDescription)
        }
    }
}
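
One piece not shown here: the iOS side also needs the session activated and a delegate set, mirroring the watch extension’s awake(withContext:) code. A minimal sketch (wherever you do your iOS-side setup):

// iOS-side WCSession setup; self must conform to WCSessionDelegate.
if WCSession.isSupported() {
    WCSession.default().delegate = self
    WCSession.default().activate()
}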

And there the audio file is played upon receipt. This doesn’t use a file transfer (which I would prefer, since that’s a background task that doesn’t require the iOS application to be running in the foreground).


watchOS3 communication to iOS (speed)

Speed with caution

Recently I was working with watchOS 3 and iOS – transferring data back and forth (not using reply callbacks). I was using .sendMessage, and messages from the iPhone to the watch were very quick. However, messages from the watch to the phone seemed very slow, or so I thought.
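
For context, the watch side was sending something like this (a sketch – the "ancValue" key matches what the phone reads below, and currentValue is a stand-in for whatever value you’re shipping over):

// Watch extension side: fire-and-forget message to the phone.
let payload = ["ancValue": "\(currentValue)"]
WCSession.default().sendMessage(payload, replyHandler: nil) { error in
    print("send error: \(error.localizedDescription)")
}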

This is what my receive delegate looked like on the iPhone side of the BLE fence.

// Message received from the Apple Watch Extension.
func session(_ session: WCSession, didReceiveMessage message: [String : Any])
{
    let val = message["ancValue"] as! String
    messageFromWatchLabel.text = "ANC: \(val)"
}

That took anywhere from 1 to 25 seconds to populate the UILabel. Now, I miraculously remembered reading somewhere that on the iPhone, the delegate call comes back on a background thread. My eyes lit up, and I tried updating my UI on the main thread (where it needs to happen).

// Message received from the Apple Watch Extension.
func session(_ session: WCSession, didReceiveMessage message: [String : Any])
{
    let val = message["ancValue"] as! String
    // Main thread - super snappy.
    DispatchQueue.main.async {
        self.messageFromWatchLabel.text = "ANC: \(val)"
        self.slider.value = Float(val)!
        self.sliderValue.text = "\(Int(val)!)"
    }
}

Shazaam. Snappy as can be. So try not to forget about this. I didn’t need to do this on the watch side because I assume there is no threading on the watch (that we have access to)? It’s very snappy in response to the same delegate on the watch side, so I’m not sure what’s going on there. But that main thread for iOS saved my bacon.


Swift 3 appearance extensions

Appearance
I just read about this and smacked myself on the forehead. Normally when developing an application I follow some sort of branding guidelines or perhaps make use of a custom font. I think most developers/designers do this quite regularly.

Now, I’ve found myself coding to those conventions over and over for various labels and similar user interface controls. It’s a pain in the ass. If something needs to change, you can search and replace in the project, or use messier methods. Not fun, and prone to errors.

Extensions. We all use them too. However, they can be used to simplify your code and maintain consistency application-wide. You can put these in a separate Swift file to make them easier to track down.

/** Appearance.swift */
import UIKit 
extension UIColor { 
    // Custom colours go here 
    func custom() -> UIColor { 
        return UIColor(red: 0.45, green: 0, blue: 0.95, alpha: 1) 
    } 
} 
extension UIFont { 
    // Custom fonts go here 
    func custom(size:CGFloat) -> UIFont { 
        return UIFont(name: "Avenir Next", size: size)! 
    }
}

I’ll reformat the above code soon, I’m on my phone at the moment. But look at that. It can be so simple to make edits to affect a great number of controls.

Here is an example implementation of the extension.

// Use the extensions from Appearance.swift
label.textColor = UIColor().custom() 
label.font = UIFont().custom(size: 20)

How about using static variables? Excellent and clean!

import UIKit 
extension UIColor { 
    // Static, computed variable for custom colour 
    static var custom:UIColor { 
        return UIColor(red: 0.45, green: 0, blue: 0.95, alpha: 1) 
    }
}

Now here is the implementation using a static variable.

label.textColor = .custom

Shazam. To me, this is really excellent. Easy to use, easy to modify, and it keeps your view controllers and the like slimmer and thus easier to read. No long lists of label modifications. You can take this concept quite far if you’d like to.
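
One more step I’d probably take (a sketch, same idea as the color): give UIFont the static treatment too, so the call sites stay short.

import UIKit
extension UIFont {
    // Static factory for the app's standard font.
    static func custom(size: CGFloat) -> UIFont {
        return UIFont(name: "Avenir Next", size: size)!
    }
}

// Usage:
label.font = .custom(size: 20)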