
My solution to Advent of Code 2020’s Day 4 challenge, in Python

Welcome to another installment in my Advent of Code 2020 series, where I present my solutions to this year’s Advent of Code challenges!

In this installment, I share my Python solution to Day 4 of Advent of Code, a.k.a. “Passport Processing”.

Spoiler alert!

Please be warned: If you want to try solving the challenge on your own and without any help, stop reading now! The remainder of this post will be all about my solution to both parts of the Day 4 challenge.

The Day 4 challenge, part one

The challenge

Here’s the text from part one of the challenge:

You arrive at the airport only to realize that you grabbed your North Pole Credentials instead of your passport. While these documents are extremely similar, North Pole Credentials aren’t issued by a country and therefore aren’t actually valid documentation for travel in most of the world.

It seems like you’re not the only one having problems, though; a very long line has formed for the automatic passport scanners, and the delay could upset your travel itinerary.

Due to some questionable network security, you realize you might be able to solve both of these problems at the same time.

The automatic passport scanners are slow because they’re having trouble detecting which passports have all required fields. The expected fields are as follows:

  • byr (Birth Year)
  • iyr (Issue Year)
  • eyr (Expiration Year)
  • hgt (Height)
  • hcl (Hair Color)
  • ecl (Eye Color)
  • pid (Passport ID)
  • cid (Country ID)

Passport data is validated in batch files (your puzzle input). Each passport is represented as a sequence of key:value pairs separated by spaces or newlines. Passports are separated by blank lines.

Here is an example batch file containing four passports:

ecl:gry pid:860033327 eyr:2020 hcl:#fffffd
byr:1937 iyr:2017 cid:147 hgt:183cm

iyr:2013 ecl:amb cid:350 eyr:2023 pid:028048884
hcl:#cfa07d byr:1929

hcl:#ae17e1 iyr:2013
eyr:2024
ecl:brn pid:760753108 byr:1931
hgt:179cm

hcl:#cfa07d eyr:2025 pid:166559648
iyr:2011 ecl:brn hgt:59in

The first passport is valid – all eight fields are present. The second passport is invalid – it is missing hgt (the Height field).

The third passport is interesting; the only missing field is cid, so it looks like data from North Pole Credentials, not a passport at all! Surely, nobody would mind if you made the system temporarily ignore missing cid fields. Treat this “passport” as valid.

The fourth passport is missing two fields, cid and byr. Missing cid is fine, but missing any other field is not, so this passport is invalid.

According to the above rules, your improved system would report 2 valid passports.

Count the number of valid passports – those that have all required fields. Treat cid as optional. In your batch file, how many passports are valid?

Importing the data

Every Advent of Code participant gets their own set of data. I copied my data and went through my usual process of bringing it into Python. This involves pasting it into a triple-quoted string and assigning it to the variable raw_input.

raw_input = """pid:827837505 byr:1976
hgt:187cm
iyr:2016
hcl:#fffffd
eyr:2024

hgt:189cm byr:1987 pid:572028668 iyr:2014 hcl:#623a2f
eyr:2028 ecl:amb

pid:#e9bf38 hcl:z iyr:2029 byr:2028 ecl:#18f71a hgt:174in eyr:2036

hcl:#cfa07d byr:1982 pid:573165334 ecl:gry eyr:2022 iyr:2012 hgt:180cm

cid:151 hcl:#c0946f
ecl:brn hgt:66cm iyr:2013 pid:694421369
byr:1980 eyr:2029

ecl:brn
pid:9337568136 eyr:2026
hcl:#6b5442
hgt:69cm iyr:2019 byr:2025

cid:66 hcl:#efcc98 pid:791118269 iyr:2013
eyr:2020 ecl:grn hgt:183cm byr:1993

eyr:2022
hgt:160cm iyr:2016 byr:1969 pid:767606888 ecl:gry hcl:#6b5442

hgt:157cm eyr:2026 ecl:oth hcl:#efcc98 byr:1938 iyr:2014

byr:1931 iyr:2015
ecl:gry
hgt:76in
cid:227 hcl:#09592c eyr:2024 pid:276365391

ecl:gry hgt:170cm iyr:2014 cid:285 pid:870052514
hcl:#866857 byr:1925 eyr:2025

eyr:2021
byr:1960 pid:569950896
iyr:2010 hgt:179cm hcl:#888785 cid:167

hgt:154in cid:194
pid:8142023665 byr:2010 hcl:7d22ff ecl:utc iyr:2026 eyr:1976

ecl:blu eyr:2030 hgt:192cm
pid:363860866 iyr:2019 hcl:#ceb3a1 byr:1963

byr:1947 hgt:167cm hcl:#7d3b0c ecl:amb
cid:70 eyr:2022 iyr:2019 pid:756932371

hgt:185cm pid:871945454
iyr:2020
hcl:#866857 ecl:amb
byr:1989 cid:184 eyr:2030

byr:1935 pid:322117407
hgt:153cm iyr:2011
cid:244 eyr:2022 hcl:#efcc98 ecl:hzl

ecl:blu hcl:#5e6c12
eyr:2029 iyr:2011 hgt:191cm byr:1992

hcl:#7d3b0c eyr:2029
hgt:163cm
pid:625292172 byr:1932 ecl:brn
iyr:2020

hgt:158cm
eyr:2030 iyr:2016 byr:1969
cid:173 pid:092921211 hcl:#602927 ecl:grn

hcl:#733820
iyr:2016 eyr:2029
ecl:hzl hgt:180cm pid:292904469 byr:1984

ecl:amb pid:901224456 hgt:190cm
iyr:2013
hcl:#733820
byr:1922

pid:262285164 iyr:2010
byr:2018 eyr:2026 hcl:#602927 hgt:179cm ecl:gmt cid:349

byr:1956 eyr:2027 pid:351551997 hgt:71in cid:277 hcl:#cfa07d iyr:2010 ecl:grn

eyr:2027 hcl:#602927 hgt:157cm ecl:gry
cid:128 byr:1953
pid:231551549 iyr:2012

iyr:2011 pid:771266976
cid:264 byr:1955 hcl:#b6652a
hgt:189cm ecl:blu
eyr:2030

eyr:2026 pid:698455242
byr:1949 ecl:gry hgt:190cm
iyr:2013 hcl:#efcc98 cid:139

ecl:blu hgt:181cm byr:1977 iyr:2011 eyr:2022
pid:454163967 hcl:#b6652a

pid:534506872 hgt:155cm iyr:2012
byr:1968
cid:333 eyr:2024 hcl:#623a2f
ecl:amb

hgt:162cm
iyr:2020
hcl:#733820 eyr:2027 byr:1995 ecl:gry pid:084994685

iyr:2016 byr:1990
ecl:amb pid:185689022 eyr:2025
hgt:184cm hcl:#866857

byr:2016 hcl:z iyr:2022 hgt:166in
eyr:2040

byr:1943 hgt:152cm hcl:#cfa07d ecl:hzl iyr:2016 cid:300 pid:376088014

iyr:2020 eyr:2026 hcl:#602927 ecl:gry byr:1962 pid:453907789 hgt:172cm

eyr:2023 hgt:185cm
hcl:#623a2f pid:963767258 byr:1977
iyr:2019 ecl:oth

hgt:159cm byr:1965 cid:349 ecl:blu pid:962908167
iyr:2013 eyr:2024
hcl:#fffffd

eyr:2026
pid:912822238 hgt:66in byr:1985 iyr:2018 hcl:#c0946f ecl:hzl

hgt:167cm hcl:#ceb3a1
byr:1990 eyr:2027 ecl:grn
iyr:2011 pid:642877667

hcl:#7d3b0c byr:1921 pid:976412756 hgt:192cm
iyr:2013 ecl:gry

iyr:2030 pid:283599139
eyr:2039 cid:203
hcl:f943cb
hgt:111

hgt:190cm
iyr:2027 ecl:blu hcl:z
byr:2004 eyr:2039
pid:734570034

hcl:#6b5442 hgt:191cm
ecl:oth byr:1989 pid:669414669 cid:196 iyr:2016 eyr:2023

ecl:brn eyr:2028 byr:1965 pid:630674502 hcl:#602927 iyr:2020 hgt:61in

iyr:2016 eyr:2022 cid:225
hcl:#733820 ecl:hzl hgt:166cm
byr:1934
pid:232742206

ecl:amb hcl:#602927 eyr:2029
pid:897535300
hgt:189cm byr:1952
iyr:2017

pid:853604345
hgt:161cm cid:269
hcl:#fffffd eyr:2030 iyr:2011 ecl:grn byr:1966

hgt:151cm hcl:#18171d eyr:2026 ecl:grn iyr:2016 pid:176cm
byr:2000

hcl:#341e13
eyr:2022
pid:536989527 cid:73 byr:1971
ecl:hzl

pid:739005658 hcl:#b6652a
eyr:2026 hgt:154cm ecl:hzl
iyr:2019 byr:1935

pid:373465835 ecl:oth byr:1932 cid:333 hgt:165cm
hcl:#b6652a eyr:2021 iyr:2014

byr:1967 pid:486658617 hcl:#18171d hgt:174cm
eyr:2021 iyr:2015 ecl:gry cid:53

eyr:2024
cid:124 iyr:2017 hgt:152cm pid:095649305 hcl:#341e13
byr:1920 ecl:oth

hcl:#623a2f
byr:1951 pid:993284548
cid:106
hgt:186cm
ecl:amb iyr:2017 eyr:2029

cid:308 pid:080673934
hgt:193cm
byr:1967 hcl:#623a2f iyr:2016 ecl:hzl
eyr:2021

iyr:2010 eyr:2024 byr:1946 hgt:156cm
cid:199
ecl:blu hcl:#866857

ecl:blu byr:1955 eyr:2022 cid:95 pid:139391569
iyr:2019 hgt:180cm
hcl:#efcc98

ecl:brn pid:579889368
eyr:2023 hgt:158cm byr:1935
iyr:2018 hcl:#cfa07d

byr:1920 pid:90919899 hcl:#18171d
hgt:152cm
eyr:2029 ecl:oth iyr:2014

byr:1961 eyr:2024
ecl:#d401e3 iyr:2011 hgt:172cm pid:919145070
cid:100
hcl:#efcc98

ecl:gry
hgt:168cm
hcl:#888785 byr:1942 pid:731032830 iyr:2014
eyr:2028

hcl:#6b5442 pid:265747619 hgt:191cm
cid:217
eyr:2028
iyr:2019 ecl:amb
byr:1948

iyr:2011 ecl:brn
hgt:183cm hcl:#fffffd cid:258 byr:1983
pid:835909246

byr:2030
iyr:2024 ecl:#f66808
hcl:fd548d cid:183
pid:#fced33
hgt:160in

ecl:utc hgt:183in hcl:a92c31 pid:0394222041
iyr:2008
eyr:1976 byr:2020

pid:126195650 iyr:2019 hcl:#341e13
ecl:blu
hgt:150cm
eyr:2025
byr:1964

cid:71 iyr:2016 hgt:157 ecl:grt
hcl:#18171d pid:#1ab5ea eyr:2027

eyr:2026 hcl:#b5266f
byr:1971
cid:269 hgt:192cm iyr:2012
pid:736578840 ecl:amb

pid:152109472 hcl:#ceb3a1 ecl:grn hgt:188cm eyr:2027
byr:1923

hcl:#341e13 pid:535175953 hgt:63in eyr:2028 iyr:2015 byr:1999 ecl:gry

hgt:183cm pid:611738968 byr:2001
eyr:2020 hcl:#a97842 iyr:2014
ecl:gry

eyr:2038 ecl:gmt pid:113210210 iyr:2012 byr:2011
hcl:z
hgt:157cm

hgt:157cm
pid:699449127
iyr:2014 ecl:gry byr:1980 hcl:#fffffd eyr:2029

iyr:2028 hcl:z pid:152cm
eyr:2039
ecl:#4760fb hgt:177in
byr:2017

eyr:2026 hcl:#efcc98
iyr:2020 hgt:180cm ecl:hzl pid:747449965 byr:2016

byr:1974 iyr:2019
cid:89 eyr:2023 pid:421418405
hcl:#fffffd hgt:192cm
ecl:gry

hcl:26c2ef eyr:2029 cid:309 byr:1931 ecl:grn pid:#4eb099 iyr:2024
hgt:174cm

ecl:gry
hgt:183cm
cid:281
eyr:2022 pid:050492569
byr:1968 hcl:c88145
iyr:2015

eyr:2028
iyr:2014 pid:712984515 hgt:187cm cid:206 hcl:#866857 byr:1927
ecl:brn

byr:1936 hgt:61in ecl:oth iyr:2012 pid:447813841
hcl:#c0946f
cid:126 eyr:2021

ecl:gry pid:791970272
eyr:2020
byr:1932 hcl:#623a2f hgt:161cm
iyr:2015

hcl:#c0946f
byr:1935 pid:721144576 eyr:2025 hgt:162cm
iyr:2017 ecl:oth

byr:1959
pid:551109135
ecl:hzl hgt:68in
eyr:1977 hcl:#888785
iyr:1955 cid:100

hgt:190in eyr:1993 pid:8358180772 iyr:1975
ecl:oth
byr:2024
hcl:3de172

eyr:2030 hgt:190cm hcl:#a40ef3 byr:1935 pid:484932501
ecl:amb iyr:2016

iyr:2015
byr:1964
hgt:176cm
pid:819552732 hcl:#c0946f ecl:amb cid:263
eyr:2024

hgt:65cm cid:59 eyr:2027 pid:074880819 ecl:utc iyr:2023
byr:1954 hcl:#623a2f

byr:1954 hgt:167cm iyr:2020
eyr:2023 hcl:#602927
pid:280295309
ecl:hzl cid:168

hgt:168cm pid:311043701 iyr:2017 byr:1965
ecl:hzl
eyr:2026 hcl:#fffffd

hcl:#fffffd ecl:grn pid:672987232 iyr:2012 eyr:2022 hgt:66in

iyr:2012 ecl:#6f4f9f
hgt:133 byr:1937
eyr:1953 pid:7177768428 hcl:#602927

iyr:2010
byr:1922 hcl:#c0946f
eyr:2029 ecl:gry
hgt:165cm
pid:893045052

iyr:2013 eyr:2028 hcl:#866857 pid:137143403
ecl:brn hgt:170cm byr:1940 cid:194

hgt:161cm
eyr:2027 pid:3966920279 ecl:gry iyr:2015 byr:1997 hcl:#cfa07d

ecl:amb
hgt:157cm byr:1971
pid:562746894 cid:305 hcl:#0b0e1a eyr:2021 iyr:2016

hcl:8b821d hgt:157cm pid:187cm cid:298 eyr:1926 iyr:2019
ecl:amb
byr:2030

hgt:155cm hcl:#341e13 byr:1924 pid:779847670
ecl:hzl iyr:2015
eyr:2024

pid:768590475 hcl:#a97842 iyr:2014 cid:128 eyr:2029
ecl:oth hgt:164cm byr:1990

iyr:2019 hgt:181cm cid:342
eyr:2020 ecl:gry byr:2001
hcl:#623a2f
pid:473165431

byr:1928 eyr:2026 hcl:#42a9cb iyr:2010
ecl:grn hgt:157cm pid:638074984

eyr:2028
byr:1951
pid:239781647 iyr:2020 hgt:156cm
ecl:hzl cid:215 hcl:#efcc98

pid:636605355 ecl:hzl
iyr:2017 cid:323 eyr:2025
byr:1995
hcl:#18171d hgt:187cm

byr:1933 hcl:#866857 hgt:152cm ecl:oth iyr:2014 pid:900790914 eyr:2030 cid:267

ecl:brn byr:1999 eyr:2027 hcl:#623a2f iyr:2017
pid:853165955
hgt:152cm

eyr:2030 pid:316704688 hcl:#c0946f ecl:brn iyr:2014 hgt:193cm

iyr:2012 byr:1928
hgt:154cm pid:570535769 hcl:#623a2f eyr:2026 ecl:hzl

iyr:2016 cid:252 eyr:2030 hcl:#888785
hgt:177cm ecl:grn byr:2002 pid:568715162

pid:570999226 iyr:2012 hgt:150cm
byr:2024
ecl:brn hcl:z eyr:2029

pid:174002299 iyr:2019 hcl:#cfa07d ecl:brn byr:1927
cid:77 hgt:159cm eyr:2027

ecl:#d16191 eyr:2022 pid:166cm hgt:165cm hcl:#18171d iyr:2015

pid:112585759
hcl:#341e13 eyr:2025 byr:1962 hgt:164cm ecl:hzl iyr:2018

pid:478415905 eyr:2025 cid:315
ecl:amb hgt:91
iyr:2014 hcl:#cc9d80
byr:1985

pid:561885837 hcl:#7d3b0c
hgt:169cm
byr:1921 iyr:2014 cid:178
eyr:2022 ecl:gry

ecl:#c87497 hcl:5321a2 eyr:2020 hgt:74in
pid:#7a62c6 iyr:1976

eyr:2037
pid:858202391 hgt:162cm
ecl:grn byr:2003
cid:278
iyr:2010 hcl:cbf662

ecl:blu iyr:2012 hgt:183cm hcl:#623a2f pid:848200472 byr:1997 eyr:2027

byr:1942
hgt:164cm
pid:464257339
iyr:2016
hcl:#7d3b0c ecl:gry

iyr:2012 hcl:#ceb3a1
hgt:193cm ecl:amb
pid:667987561 eyr:2024 byr:1960

hgt:187cm
pid:222340640
iyr:2018 eyr:2022
ecl:oth
byr:1957
hcl:#336667 cid:83

eyr:2025 iyr:2015 hcl:#733820
ecl:brn
pid:131195653

hgt:185cm eyr:2026
ecl:amb byr:1998 pid:938587659 hcl:#733820
iyr:2016

ecl:oth pid:300949722
eyr:2028 iyr:2016
byr:1933
hgt:179cm
hcl:#cfa07d

byr:1974 iyr:2019
ecl:hzl hcl:#c0946f eyr:2024 pid:484547079
cid:112
hgt:185cm

eyr:2022 iyr:2018 hcl:#fffffd pid:118568279
hgt:153cm ecl:gry byr:1941 cid:341

iyr:2018
eyr:2027 hcl:#888785
byr:1970 hgt:165cm pid:773715893
ecl:amb

hcl:#623a2f hgt:156cm byr:1938 iyr:2012 pid:745046822
ecl:amb
eyr:2030

iyr:2012
pid:097961857
eyr:2023 hgt:66in hcl:#fffffd byr:1962 ecl:utc

byr:1943 hgt:150cm
iyr:2012
pid:740693353 eyr:2023
hcl:#18171d cid:101 ecl:blu

iyr:2018 pid:183728523 byr:1924 hgt:154cm eyr:2030
cid:167 ecl:blu hcl:#ceb3a1

hgt:69cm
eyr:2025 hcl:z ecl:brn byr:1982 pid:250782159
iyr:2011

byr:1998 iyr:2018 hcl:#341e13 eyr:2022 hgt:157cm pid:497100444 cid:266 ecl:gry

eyr:2027 iyr:2011 hcl:#6b5442 hgt:156cm pid:494073085
byr:1998
ecl:hzl

byr:1947 hcl:#b6652a
iyr:2011 pid:228986686 eyr:2030 hgt:175cm cid:70 ecl:brn

eyr:2026 hgt:159cm
byr:1946 pid:534291476
iyr:2018 ecl:gry cid:225
hcl:#18171d

pid:439665905
cid:311 ecl:amb iyr:2018
eyr:2030
hgt:186cm byr:1950
hcl:#cfa07d

pid:250175056 hcl:#efcc98
byr:1981 cid:262 hgt:154cm ecl:gry iyr:2020 eyr:2027

pid:461335515 iyr:2014 hcl:#f1cf00 hgt:180cm ecl:amb eyr:2027
byr:1956

iyr:2014 eyr:2030 cid:194
pid:234623720 hcl:#733820
hgt:164cm byr:1929
ecl:blu

byr:1992
eyr:2024 hcl:#ef8161 cid:216
ecl:brn hgt:177cm iyr:2018
pid:101726770

hcl:#341e13 hgt:178cm iyr:2016 eyr:2029 byr:1945 pid:045325957 ecl:grn cid:99

ecl:gry
iyr:2012
cid:52 hgt:168cm byr:1943
hcl:#cfa07d
pid:899608935 eyr:2030

cid:241
byr:1934 hgt:161cm eyr:2027 iyr:2011 hcl:#c0946f ecl:amb pid:346857644

iyr:2019 hgt:178cm
hcl:#c0946f byr:1957
eyr:2026
ecl:brn pid:222885240

ecl:blu
eyr:2021 cid:312 hcl:#733820 hgt:186cm iyr:2012 byr:1969
pid:821704316

hcl:#6b5442 cid:159
hgt:180cm
iyr:2018
eyr:2028
ecl:hzl byr:1966
pid:#e0238e

pid:622400994 eyr:2022 hcl:#5b6635 iyr:2012 byr:1980
hgt:190cm ecl:oth

byr:1976 ecl:gry eyr:2020 iyr:2020 hgt:171cm pid:219878671 hcl:#6b5442

hgt:163cm byr:1968
pid:003521394 ecl:oth
iyr:2010
cid:61 hcl:#888785

cid:115 pid:810722029 hgt:166cm byr:1955
ecl:blu eyr:2030 iyr:2018

hgt:176cm
eyr:2025
pid:617393532 hcl:#733820 byr:1975 iyr:2018 ecl:grn

hcl:#733820 byr:1979 pid:838168666
hgt:190cm ecl:oth cid:330
eyr:2029 iyr:2018

eyr:1940 hgt:67cm iyr:2009 ecl:gry pid:#e76a62 byr:2020 hcl:z

hgt:190cm ecl:brn pid:396113351
byr:1956 iyr:2010
hcl:#6b5442 eyr:2024
cid:256

hcl:#efcc98
hgt:178cm byr:1984 iyr:2013 pid:752620212 eyr:2021 ecl:gry

iyr:2014 hcl:#a97842
hgt:166cm ecl:blu eyr:2024
byr:1935
pid:836748873

cid:236 ecl:amb hgt:168cm iyr:2010 hcl:#602927 byr:1950 eyr:2026 pid:404810674

eyr:2030 ecl:grn
byr:1975 pid:064596263 hgt:193cm
iyr:2019 cid:71 hcl:#a97842

iyr:2014
pid:298386733 hcl:#c0946f
hgt:180cm ecl:hzl cid:115 byr:1940 eyr:2023

iyr:1960 hgt:139 ecl:#9db7b8 byr:1980 pid:#ef597b cid:54 eyr:2028 hcl:fdcda3

iyr:2015 byr:1954 ecl:blu hgt:62in hcl:#ceb3a1 pid:253593755 eyr:2028

eyr:2025 ecl:blu pid:216388098 iyr:2017 byr:1968 hgt:151cm hcl:#602927

eyr:2022 hcl:#a97842
pid:606979543 iyr:2013 ecl:grn cid:63
hgt:186cm byr:1992

ecl:gry
hgt:168cm hcl:#18171d iyr:2017 pid:670898814 byr:1983
eyr:2022

hgt:155cm ecl:grn iyr:2012 pid:837979074 eyr:2024 hcl:#888785 byr:1972

iyr:2015 pid:970743533 hcl:#866857 eyr:2027
byr:1921 ecl:brn

eyr:2022
hgt:160cm
byr:1964 hcl:#efcc98 iyr:2019 ecl:oth pid:141923637

byr:2029 pid:3313111652 ecl:brn eyr:2034
iyr:2013 hgt:193cm hcl:z

pid:853890227 eyr:2029
hcl:#efcc98 iyr:2021 byr:2003 ecl:#037c39 hgt:160cm

iyr:1927
byr:1992
eyr:2030
hcl:#efcc98
ecl:amb hgt:152cm pid:436765906

iyr:2014
hcl:#c0946f pid:207052381
eyr:2024 ecl:hzl
hgt:177cm
byr:1923

ecl:blu
iyr:2014
eyr:2025 hgt:165cm
hcl:#733820 pid:343011857 byr:1967

ecl:xry
eyr:2028
iyr:2011 hgt:166in hcl:#c0946f
pid:805297331
cid:167 byr:1926

byr:1947
pid:468012954 eyr:2026 ecl:oth iyr:2018 hgt:170cm hcl:#b6652a

hcl:#6b5442 ecl:brn
hgt:180cm cid:233
pid:029789713
byr:1920 iyr:2010 eyr:2024

iyr:2010 eyr:2027
hgt:156cm
hcl:#c0946f
byr:1960 pid:312723130 ecl:hzl

eyr:2023 byr:1959 iyr:2010 hgt:186cm pid:066768932 ecl:grn hcl:#602927 cid:310

eyr:2030 pid:460535178 hgt:171cm ecl:gry iyr:2020 byr:1934 hcl:#888785

hgt:64cm eyr:2021 byr:1995 cid:336
ecl:gmt pid:926714223 iyr:2017 hcl:#18171d

eyr:2022 iyr:2010
ecl:grn pid:285994301 cid:215
hgt:186cm byr:1978

hgt:63in hcl:#866857
pid:386128445 iyr:2020 byr:1971 eyr:2021 ecl:gry

hgt:183cm hcl:#733820 iyr:2015
ecl:blu pid:216205626 eyr:2022 byr:1941

cid:150 ecl:amb pid:872515243 byr:1926
eyr:1996
hcl:#dedc39 hgt:67in iyr:2020

byr:1927 ecl:brn cid:153 iyr:2011
pid:165190810 hcl:#fffffd
eyr:2028 hgt:64in

pid:502603734
byr:1966 iyr:2015 hgt:176cm cid:205 ecl:brn hcl:#fffffd eyr:2021

hcl:#18171d hgt:158cm byr:1943 iyr:2019
pid:058840094
eyr:2023

byr:1962 hcl:#b6652a ecl:grn
cid:297
iyr:2010 pid:990422650
hgt:154cm eyr:2020

eyr:1934 iyr:2011
ecl:gry
hcl:z byr:2004 hgt:63cm pid:6173356201

pid:329432364 eyr:2029
ecl:grn hcl:#18171d iyr:2013
hgt:158cm byr:1960

hcl:#efcc98 iyr:2016 hgt:186cm cid:215
pid:852781253 eyr:2027 ecl:blu byr:1937

hcl:#623a2f ecl:gry iyr:2020 byr:1972 hgt:182cm pid:073426952 eyr:2027

hcl:#3317b9 byr:1950 pid:304511418 hgt:177cm cid:124 eyr:2020 ecl:hzl iyr:2014

eyr:2029
pid:034754507 byr:1936
cid:265 ecl:#b50997 hgt:183cm
hcl:#623a2f iyr:1924

eyr:2024 byr:1927 cid:243 ecl:gry hcl:#6b5442 pid:714355627 hgt:160cm
iyr:2016

hgt:152cm
ecl:gry hcl:#a97842
eyr:2029 byr:1952
pid:555308923 iyr:2010

byr:2008
pid:19681314 hgt:180in iyr:2030 ecl:gry cid:272
eyr:2023
hcl:#b6652a

cid:234
iyr:2014 byr:1940 ecl:hzl pid:042231105 hcl:#3bf69c hgt:172cm eyr:2029

hcl:#efcc98 pid:831567586 hgt:190cm iyr:2017
byr:1966 eyr:2024 ecl:blu

hcl:#341e13 ecl:blu
eyr:2022 cid:161 pid:197839646 iyr:2014

hcl:#cfa07d
byr:1957
iyr:2019 hgt:181cm
pid:543775141 ecl:oth eyr:2021

hcl:z
pid:#596c41 eyr:2035
byr:2008 iyr:1975
ecl:#c66ee6
hgt:150in

ecl:grn
hcl:#7d3b0c iyr:2016
pid:804255369 eyr:2028 byr:1983 hgt:69in cid:82

eyr:2022
iyr:2013 hgt:191cm ecl:gry
hcl:#a97842 pid:186827268 byr:1969

pid:871672398 eyr:2026 byr:1946 ecl:oth
iyr:2015
hcl:#866857 hgt:185cm

byr:1973
hgt:150cm
pid:905076707
iyr:2017
hcl:#2edf01 ecl:oth cid:221 eyr:2026

eyr:2024 ecl:grn pid:955444191 hcl:z iyr:2015 byr:2008 hgt:151cm

byr:1958 hcl:#fffffd pid:218986541 cid:203 ecl:brn hgt:154cm
iyr:2014
eyr:2026

hcl:#623a2f byr:1964 ecl:oth iyr:2010 pid:525843363 hgt:164cm eyr:2025

ecl:blu iyr:2013 hgt:193cm byr:1990 pid:612387132 hcl:#18171d cid:280 eyr:2028

ecl:oth eyr:2022
pid:110447037 hgt:187cm byr:1967 hcl:#efcc98

byr:1930
eyr:2026 hgt:159cm
iyr:2011
ecl:hzl hcl:#6b5442 pid:923471212

cid:350
eyr:2029 pid:823592758 iyr:2018
ecl:grn byr:1972 hgt:167cm hcl:#18171d

cid:76 eyr:2027 hcl:#6b5442 pid:099579798 byr:1930
iyr:2020
ecl:gry hgt:153cm

byr:1957 ecl:brn
hcl:z iyr:2016 pid:352677969 hgt:189cm
eyr:2029

cid:143 eyr:2035 pid:602952079
ecl:#9b73f0 hcl:#602927
iyr:2022 byr:1975
hgt:174cm

byr:1971 pid:741305897 hgt:192cm
ecl:amb hcl:#888785 eyr:2028 iyr:2011

ecl:oth iyr:2016
byr:1942 hgt:189cm hcl:#888785 eyr:2024 pid:054290182

hcl:#a97842
byr:1945
ecl:amb pid:370849304
eyr:2028
iyr:2016 hgt:168cm

hgt:154cm iyr:2015 eyr:2030 byr:1952 ecl:hzl hcl:#341e13 pid:996518075

byr:1941 ecl:amb iyr:2014
hcl:#fffffd pid:560990286 eyr:2022 hgt:173cm

ecl:blu byr:1974
hgt:150cm hcl:#ceb3a1 eyr:2020 iyr:2013
pid:827415351

hcl:#623a2f eyr:2027 iyr:2011 pid:913199234 ecl:oth
byr:1990 hgt:178cm

ecl:blu byr:1989 hcl:#b6652a
eyr:2026 pid:724881482 hgt:185cm iyr:2014

cid:115 pid:255002731 eyr:2025 ecl:amb
byr:1934 iyr:2020 hcl:#7d3b0c

hgt:150cm byr:1969 ecl:blu iyr:2023
hcl:#866857 pid:754288625 eyr:2029

iyr:2011 hcl:#7d3b0c ecl:hzl
byr:1930
hgt:188cm
eyr:2023
pid:256556076 cid:136

iyr:2025 byr:1978
ecl:#fe30a9 hcl:#efcc98 eyr:2029
pid:392032459 hgt:178cm

eyr:2027 iyr:2017 hgt:160in
byr:1990 pid:131099122 hcl:#623a2f ecl:amb

ecl:grn
byr:1978
eyr:2029 hcl:#18171d
hgt:165cm pid:172369888
cid:93
iyr:2011

ecl:hzl
hcl:#733820 iyr:2010 eyr:2029 pid:127253449
hgt:156cm
byr:1963

hcl:#6c8530
iyr:2020
byr:1929 eyr:2021 hgt:177cm ecl:oth pid:347925482

eyr:2037 iyr:2026
pid:163cm
hgt:174in byr:2007 hcl:c1305f cid:134
ecl:#0cf85c

iyr:2011 pid:033811215
hcl:#a97842 byr:2002 eyr:2021 hgt:186cm
ecl:brn

hcl:#a97842
iyr:2020 eyr:2029 byr:1972 pid:535511110 hgt:160cm ecl:oth

ecl:grn cid:89 hgt:193cm pid:73793987 iyr:2021 eyr:2027 byr:1939 hcl:z

hcl:#623a2f
hgt:182cm cid:154
pid:873863966 iyr:2018 byr:1999 ecl:brn eyr:2031

iyr:2014 eyr:2029
cid:71 hcl:#fffffd byr:1924 hgt:63in
ecl:gry pid:897972798

hgt:76cm
hcl:z eyr:1955
iyr:2012 byr:2001 pid:9425090 ecl:hzl

eyr:2021
pid:501861442
ecl:grn hcl:#d71ae9
byr:1977
hgt:167cm iyr:2015

iyr:2014
hgt:170cm ecl:gry byr:1928 cid:314 hcl:#602927 eyr:2029
pid:836710987

eyr:2027 hcl:#efcc98 ecl:amb iyr:2016 byr:1995 pid:603705616 hgt:179cm

eyr:2030 hcl:#602927 cid:105 byr:1943 ecl:hzl
pid:381601507
hgt:188cm iyr:2020

iyr:2011
byr:1993 hcl:#c0946f pid:292649640 hgt:139 ecl:hzl cid:268
eyr:1999

cid:339 byr:1928
ecl:brn eyr:2022 hcl:#733820 hgt:191cm pid:282733347 iyr:2019

hgt:176cm
byr:1935 ecl:brn cid:252 eyr:2023 pid:105060622 iyr:2020 hcl:#18171d

ecl:hzl eyr:2029
hgt:193cm pid:770254253
hcl:#efcc98 iyr:2020 byr:1926

pid:977785261 eyr:2022 iyr:2015 byr:1978
hcl:#733820 hgt:172cm
ecl:brn

byr:2021
hgt:160in
ecl:gmt
eyr:2032 cid:345 pid:179cm
hcl:8f5c13 iyr:2029

iyr:2018 hgt:182cm ecl:gry
pid:897076789 eyr:2023 hcl:#866857
byr:1980

hgt:88 eyr:2039 cid:99 byr:2007 hcl:a1bb42 ecl:#a2f6bb
pid:2264966188
iyr:2022

iyr:2012 cid:59 ecl:gry eyr:2021
byr:1931
hgt:172cm hcl:#7d3b0c pid:862416147

byr:1962 eyr:2025
ecl:grn
hcl:#866857 hgt:180cm iyr:2014 pid:313647071

eyr:2030 hgt:157cm byr:1985
iyr:2020
hcl:#7d3b0c pid:911544768
ecl:grn

hgt:175cm
byr:1938
iyr:2020 ecl:amb hcl:#602927 eyr:2026 pid:144411560

iyr:2019 ecl:amb hcl:#888785 eyr:2025 hgt:187cm
pid:942054361 byr:1939

cid:168 pid:722146139 byr:1952 ecl:grn
iyr:2014 hgt:97
hcl:z
eyr:2023

eyr:2024 pid:567528498 ecl:gry iyr:2012 byr:1990
hcl:#733820 hgt:193cm
cid:293

hcl:#bc352c pid:321838059 byr:1930 hgt:178cm cid:213 eyr:2023 ecl:amb
iyr:2017

hgt:173cm byr:1925 pid:070222017 iyr:2013 hcl:#ceb3a1 ecl:gry eyr:2024"""

I then split() the string into a list, split_input, using two newline characters as the delimiter:

split_input = raw_input.split("\n\n")

Here’s a sample of the result:

['pid:827837505 byr:1976\nhgt:187cm\niyr:2016\nhcl:#fffffd\neyr:2024',
 'hgt:189cm byr:1987 pid:572028668 iyr:2014 hcl:#623a2f\neyr:2028 ecl:amb',
 'pid:#e9bf38 hcl:z iyr:2029 byr:2028 ecl:#18f71a hgt:174in eyr:2036',
 'hcl:#cfa07d byr:1982 pid:573165334 ecl:gry eyr:2022 iyr:2012 hgt:180cm',
 'cid:151 hcl:#c0946f\necl:brn hgt:66cm iyr:2013 pid:694421369\nbyr:1980 eyr:2029',

...

'cid:168 pid:722146139 byr:1952 ecl:grn\niyr:2014 hgt:97\nhcl:z\neyr:2023',
 'eyr:2024 pid:567528498 ecl:gry iyr:2012 byr:1990\nhcl:#733820 hgt:193cm\ncid:293',
 'hcl:#bc352c pid:321838059 byr:1930 hgt:178cm cid:213 eyr:2023 ecl:amb\niyr:2017',
 'hgt:173cm byr:1925 pid:070222017 iyr:2013 hcl:#ceb3a1 ecl:gry eyr:2024']

At this point, each item in the list had its individual components delimited by a mix of spaces and newlines. I used this line of code to convert the newlines to spaces:

split_input_2 = [string.replace("\n", " ") for string in split_input]

Here’s a sample of the result:

['pid:827837505 byr:1976 hgt:187cm iyr:2016 hcl:#fffffd eyr:2024',
 'hgt:189cm byr:1987 pid:572028668 iyr:2014 hcl:#623a2f eyr:2028 ecl:amb',
 'pid:#e9bf38 hcl:z iyr:2029 byr:2028 ecl:#18f71a hgt:174in eyr:2036',
 'hcl:#cfa07d byr:1982 pid:573165334 ecl:gry eyr:2022 iyr:2012 hgt:180cm',

...

'cid:168 pid:722146139 byr:1952 ecl:grn iyr:2014 hgt:97 hcl:z eyr:2023',
 'eyr:2024 pid:567528498 ecl:gry iyr:2012 byr:1990 hcl:#733820 hgt:193cm cid:293',
 'hcl:#bc352c pid:321838059 byr:1930 hgt:178cm cid:213 eyr:2023 ecl:amb iyr:2017',
 'hgt:173cm byr:1925 pid:070222017 iyr:2013 hcl:#ceb3a1 ecl:gry eyr:2024']

I now had a list of single-line strings, each one representing a passport, with each passport’s information delimited by spaces.

My next step was to split() each passport string into a list:

split_input_3 = [string.split() for string in split_input_2]

The result was a master list of passport lists. Here’s a sample:

[['pid:827837505',
  'byr:1976',
  'hgt:187cm',
  'iyr:2016',
  'hcl:#fffffd',
  'eyr:2024'],
 ['hgt:189cm',
  'byr:1987',
  'pid:572028668',
  'iyr:2014',
  'hcl:#623a2f',
  'eyr:2028',
  'ecl:amb'],
 ['pid:#e9bf38',
  'hcl:z',
  'iyr:2029',
  'byr:2028',
  'ecl:#18f71a',
  'hgt:174in',
  'eyr:2036'],

...

['hcl:#bc352c',
  'pid:321838059',
  'byr:1930',
  'hgt:178cm',
  'cid:213',
  'eyr:2023',
  'ecl:amb',
  'iyr:2017'],
 ['hgt:173cm',
  'byr:1925',
  'pid:070222017',
  'iyr:2013',
  'hcl:#ceb3a1',
  'ecl:gry',
  'eyr:2024']]

I wanted to convert each passport list into a dictionary, so I wrote this function…

def convert_to_dictionary(passport_list):
    dictionary = {}

    for item in passport_list:
        item_parts = item.split(":")
        key = item_parts[0]
        value = item_parts[1]
        dictionary[key] = value

    return dictionary

…which I then put to work with that ever-so-useful Python tool, the list comprehension:

passports = [convert_to_dictionary(item) for item in split_input_3]

I now had a list of passport dictionaries:

[{'pid': '827837505',
  'byr': '1976',
  'hgt': '187cm',
  'iyr': '2016',
  'hcl': '#fffffd',
  'eyr': '2024'},
 {'hgt': '189cm',
  'byr': '1987',
  'pid': '572028668',
  'iyr': '2014',
  'hcl': '#623a2f',
  'eyr': '2028',
  'ecl': 'amb'},
 {'pid': '#e9bf38',
  'hcl': 'z',
  'iyr': '2029',
  'byr': '2028',
  'ecl': '#18f71a',
  'hgt': '174in',
  'eyr': '2036'},
 {'hcl': '#cfa07d',
  'byr': '1982',
  'pid': '573165334',
  'ecl': 'gry',
  'eyr': '2022',
  'iyr': '2012',
  'hgt': '180cm'},

...

{'hcl': '#bc352c',
  'pid': '321838059',
  'byr': '1930',
  'hgt': '178cm',
  'cid': '213',
  'eyr': '2023',
  'ecl': 'amb',
  'iyr': '2017'},
 {'hgt': '173cm',
  'byr': '1925',
  'pid': '070222017',
  'iyr': '2013',
  'hcl': '#ceb3a1',
  'ecl': 'gry',
  'eyr': '2024'}]
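
As an aside, the same parsing can be squeezed into a single expression. Here’s a minimal sketch of that approach (a variation for illustration, not the code I actually used), assuming raw_input is the big triple-quoted string from above:

passports_alt = [
    dict(field.split(":", 1) for field in block.split())
    for block in raw_input.split("\n\n")
]
# passports_alt should contain the same dictionaries as the step-by-step passports list.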

Strategy

With the input data massaged into a decent data structure, it was time to test the passports to see if they were valid. Valid passports have all the required keys.

I wrote this function to test the validity of a given passport:

def is_valid_passport(passport):
    has_birth_year = "byr" in passport
    has_issue_year = "iyr" in passport
    has_expiration_year = "eyr" in passport
    has_height = "hgt" in passport
    has_hair_color = "hcl" in passport
    has_eye_color = "ecl" in passport
    has_passport_id = "pid" in passport
    has_country_id = "cid" in passport
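    # Note: cid is intentionally left out of the return expression below, since it's optional.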
    
    return (
        has_birth_year and
        has_issue_year and
        has_expiration_year and
        has_height and
        has_hair_color and
        has_eye_color and
        has_passport_id
    )

With is_valid_passport() written, I could apply it to every passport by way of a list comprehension:

valid_passports = [passport for passport in passports if is_valid_passport(passport)]
print(len(valid_passports))

My result: 228. I entered it into the solution text field, and Advent of Code told me that I was correct! It was time for part two.
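
As an aside, the same count can be computed without building the intermediate list, by summing the boolean results directly. Keeping valid_passports around turns out to be handy for part two, though. Here’s the alternative as a sketch (not the code I used):

print(sum(is_valid_passport(passport) for passport in passports))  # also prints 228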

The Day 4 challenge, part two

The challenge

Here’s the text of part two:

The line is moving more quickly now, but you overhear airport security talking about how passports with invalid data are getting through. Better add some data validation, quick!

You can continue to ignore the cid field, but each other field has strict rules about what values are valid for automatic validation:

  • byr (Birth Year) – four digits; at least 1920 and at most 2002.
  • iyr (Issue Year) – four digits; at least 2010 and at most 2020.
  • eyr (Expiration Year) – four digits; at least 2020 and at most 2030.
  • hgt (Height) – a number followed by either cm or in:
    • If cm, the number must be at least 150 and at most 193.
    • If in, the number must be at least 59 and at most 76.
  • hcl (Hair Color) – a # followed by exactly six characters 0-9 or a-f.
  • ecl (Eye Color) – exactly one of: amb blu brn gry grn hzl oth.
  • pid (Passport ID) – a nine-digit number, including leading zeroes.
  • cid (Country ID) – ignored, missing or not.

Your job is to count the passports where all required fields are both present and valid according to the above rules. Here are some example values:

byr valid:   2002
byr invalid: 2003

hgt valid:   60in
hgt valid:   190cm
hgt invalid: 190in
hgt invalid: 190

hcl valid:   #123abc
hcl invalid: #123abz
hcl invalid: 123abc

ecl valid:   brn
ecl invalid: wat

pid valid:   000000001
pid invalid: 0123456789

Here are some invalid passports:

eyr:1972 cid:100
hcl:#18171d ecl:amb hgt:170 pid:186cm iyr:2018 byr:1926

iyr:2019
hcl:#602927 eyr:1967 hgt:170cm
ecl:grn pid:012533040 byr:1946

hcl:dab227 iyr:2012
ecl:brn hgt:182cm pid:021572410 eyr:2020 byr:1992 cid:277

hgt:59cm ecl:zzz
eyr:2038 hcl:74454a iyr:2023
pid:3556412378 byr:2007

Here are some valid passports:

pid:087499704 hgt:74in ecl:grn iyr:2012 eyr:2030 byr:1980
hcl:#623a2f

eyr:2029 ecl:blu cid:129 byr:1989
iyr:2014 pid:896056539 hcl:#a97842 hgt:165cm

hcl:#888785
hgt:164cm byr:2001 iyr:2015 cid:88
pid:545766238 ecl:hzl
eyr:2022

iyr:2010 hgt:158cm hcl:#b6652a ecl:blu byr:1944 eyr:2021 pid:093154719

Count the number of valid passports – those that have all required fields and valid values. Continue to treat cid as optional. In your batch file, how many passports are valid?

Strategy

Part one was about testing for the presence of the required keys; this time, it was about testing for valid values.

To that end, I wrote this function…

def has_valid_values(passport):
    has_valid_birth_year = 1920 <= int(passport["byr"]) <= 2002
    has_valid_issue_year = 2010 <= int(passport["iyr"]) <= 2020
    has_valid_expiration_year = 2020 <= int(passport["eyr"]) <= 2030
    
    has_valid_height = False
    height_units = passport["hgt"][-2:]
    if height_units == "cm":
        height = int(passport["hgt"][:-2])
        has_valid_height = 150 <= height <= 193
    elif height_units == "in":
        height = int(passport["hgt"][:-2])
        has_valid_height = 59 <= height <= 76
        
    def is_valid_hex_string(string):
        test_value = string.lower()
        is_valid = True

        for character in test_value:
            if character not in "0123456789abcdef":
                is_valid = False
                break

        return is_valid
        
    has_valid_hair_color = False
    if len(passport["hcl"]) == 7:
        digits = passport["hcl"][1:]
        has_valid_hair_color = is_valid_hex_string(digits)
            
    has_valid_eye_color = passport["ecl"] in ["amb", "blu", "brn", "gry", "grn", "hzl", "oth"]
    
    def is_valid_passport_id(value):
        is_valid = False
        
        if len(value) == 9:
            is_valid = True

            for character in value:
                if character not in "0123456789":
                    is_valid = False
                    break
        
        return is_valid
    
    has_valid_passport_id = is_valid_passport_id(passport["pid"])

    return (
        has_valid_birth_year and
        has_valid_issue_year and
        has_valid_expiration_year and
        has_valid_height and
        has_valid_hair_color and
        has_valid_eye_color and
        has_valid_passport_id
    )

…which I then used in a list comprehension that filtered the already-validated passports from part one:

truly_valid_passports = [passport for passport in valid_passports if has_valid_values(passport)]
print(len(truly_valid_passports))

My result was 175, which was correct. Day 4 was done!
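
For comparison, here’s a more declarative way of expressing the same field rules: a table that maps each field name to a small predicate. This is a sketch of an alternative approach, not the code I used, and the names FIELD_RULES and has_valid_values_alt are made up for this example.

import re

FIELD_RULES = {
    "byr": lambda v: v.isdigit() and 1920 <= int(v) <= 2002,
    "iyr": lambda v: v.isdigit() and 2010 <= int(v) <= 2020,
    "eyr": lambda v: v.isdigit() and 2020 <= int(v) <= 2030,
    "hgt": lambda v: (v.endswith("cm") and v[:-2].isdigit() and 150 <= int(v[:-2]) <= 193)
                     or (v.endswith("in") and v[:-2].isdigit() and 59 <= int(v[:-2]) <= 76),
    "hcl": lambda v: re.fullmatch(r"#[0-9a-f]{6}", v) is not None,
    "ecl": lambda v: v in {"amb", "blu", "brn", "gry", "grn", "hzl", "oth"},
    "pid": lambda v: re.fullmatch(r"[0-9]{9}", v) is not None,
}

def has_valid_values_alt(passport):
    # A passport passes when every required field is present and its value
    # satisfies the matching rule; cid is ignored, just as before.
    return all(key in passport and rule(passport[key]) for key, rule in FIELD_RULES.items())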


My solution to Advent of Code 2020’s Day 3 challenge, in Python

Welcome to another installment in my Advent of Code 2020 series, where I present my solutions to this year’s Advent of Code challenges!

In this installment, I share my Python solution to Day 3 of Advent of Code, a.k.a. “The Toboggan Puzzle”.

Spoiler alert!

Please be warned: If you want to try solving the challenge on your own and without any help, stop reading now! The remainder of this post will be all about my solution to both parts of the Day 3 challenge.

The Day 3 challenge, part one

The challenge

Here’s the text from part one of the challenge:

With the toboggan login problems resolved, you set off toward the airport. While travel by toboggan might be easy, it’s certainly not safe: there’s very minimal steering and the area is covered in trees. You’ll need to see which angles will take you near the fewest trees.

Due to the local geology, trees in this area only grow on exact integer coordinates in a grid. You make a map (your puzzle input) of the open squares (.) and trees (#) you can see. For example:

..##.......
#...#...#..
.#....#..#.
..#.#...#.#
.#...##..#.
..#.##.....
.#.#.#....#
.#........#
#.##...#...
#...##....#
.#..#...#.#

These aren’t the only trees, though; due to something you read about once involving arboreal genetics and biome stability, the same pattern repeats to the right many times:

..##.........##.........##.........##.........##.........##.......  --->
#...#...#..#...#...#..#...#...#..#...#...#..#...#...#..#...#...#..
.#....#..#..#....#..#..#....#..#..#....#..#..#....#..#..#....#..#.
..#.#...#.#..#.#...#.#..#.#...#.#..#.#...#.#..#.#...#.#..#.#...#.#
.#...##..#..#...##..#..#...##..#..#...##..#..#...##..#..#...##..#.
..#.##.......#.##.......#.##.......#.##.......#.##.......#.##.....  --->
.#.#.#....#.#.#.#....#.#.#.#....#.#.#.#....#.#.#.#....#.#.#.#....#
.#........#.#........#.#........#.#........#.#........#.#........#
#.##...#...#.##...#...#.##...#...#.##...#...#.##...#...#.##...#...
#...##....##...##....##...##....##...##....##...##....##...##....#
.#..#...#.#.#..#...#.#.#..#...#.#.#..#...#.#.#..#...#.#.#..#...#.#  --->

You start on the open square (.) in the top-left corner and need to reach the bottom (below the bottom-most row on your map).

The toboggan can only follow a few specific slopes (you opted for a cheaper model that prefers rational numbers); start by counting all the trees you would encounter for the slope right 3, down 1:

From your starting position at the top-left, check the position that is right 3 and down 1. Then, check the position that is right 3 and down 1 from there, and so on until you go past the bottom of the map.

The locations you’d check in the above example are marked here with O where there was an open square and X where there was a tree:

..##.........##.........##.........##.........##.........##.......  --->
#..O#...#..#...#...#..#...#...#..#...#...#..#...#...#..#...#...#..
.#....X..#..#....#..#..#....#..#..#....#..#..#....#..#..#....#..#.
..#.#...#O#..#.#...#.#..#.#...#.#..#.#...#.#..#.#...#.#..#.#...#.#
.#...##..#..X...##..#..#...##..#..#...##..#..#...##..#..#...##..#.
..#.##.......#.X#.......#.##.......#.##.......#.##.......#.##.....  --->
.#.#.#....#.#.#.#.O..#.#.#.#....#.#.#.#....#.#.#.#....#.#.#.#....#
.#........#.#........X.#........#.#........#.#........#.#........#
#.##...#...#.##...#...#.X#...#...#.##...#...#.##...#...#.##...#...
#...##....##...##....##...#X....##...##....##...##....##...##....#
.#..#...#.#.#..#...#.#.#..#...X.#.#..#...#.#.#..#...#.#.#..#...#.#  --->

In this example, traversing the map using this slope would cause you to encounter 7 trees.

Starting at the top-left corner of your map and following a slope of right 3 and down 1, how many trees would you encounter?

Importing the data

Every Advent of Code participant gets their own set of data. I copied my data and went through my usual process of bringing it into Python. This involves pasting it into a triple-quoted string and assigning it to the variable raw_input.

raw_input = """...#...###......##.#..#.....##.
..#.#.#....#.##.#......#.#....#
......#.....#......#....#...##.
...#.....##.#..#........##.....
...##...##...#...#....###....#.
...##...##.......#....#...#.#..
..............##..#..#........#
#.#....#.........#...##.#.#.#.#
.#..##......#.#......#...#....#
#....#..#.#.....#..#...#...#...
#.#.#.....##.....#.........#...
......###..#....#..#..#.#....#.
##.####...#.............#.##..#
....#....#..#......#.......#...
...#.......#.#..#.........##.#.
......#.#.....###.###..###..#..
##..##.......#.#.....#..#....#.
..##.#..#....#.............##.#
....#.#.#..#..#........##....#.
.....####..#..#.###..#....##..#
#.#.......#...##.##.##..#....#.
.#..#..##...####.#......#..#...
#...##.......#...####......##..
...#.####....#.#...###.#.#...#.
....#...........#.##.##.#......
.....##...#.######.#..#....#..#
.#....#...##....#..######....#.
...#.....#...#####.##...#..#.#.
.....#...##........##.##.##.###
#.#..#....##....#......#....#.#
......##...#.........#....#.#..
###..#..##......##.#####.###.##
#.....#..##.##....#...........#
##..#.#..##..#.#.....#......#..
.#.##.#..#.#....##..#..#....#..
.#......##..##.#...#..#.......#
#....##.##..###..###......##.#.
....###..##.......#.###.#....#.
..##........#........##.....#..
.#..#.....#...####.##...##.....
....#.#.#.....#.##..##.....#..#
..............#.....#...#.....#
.#.....#......###...........#.#
.....#.#......#.##..#..........
.#......###............#.#.##..
.#.#....##.#..###.....#.##..#.#
.......#.#.#..#..#..#...##..#.#
.#.###...##.#.#.####.#.#...#...
...#.#....#......##.##.#.......
#...#.....##....#........##....
.....###...#.##.#......##.#..#.
..#...##.##.###..#..#......####
.#.##.#..#.##..##..........##..
..#.#.#..#.#.....#...###.....#.
..#..#.#....#.##.............##
.......#..###..#.#...#.....##.#
####.#.#......#..#.##.........#
..........#.....#..##......###.
..#..............#...#..##.....
......#.#.#..#.##.....####..##.
.##.#..#.#....#.......#..#.....
..#..#..#.##.#....###.#.#.#.#.#
.....#....#......###..#........
#.#.#..#...###.....#......#.##.
...#.#....#.#......#........#..
..#...###.#...#..#....##...#..#
.###.##..#..#...###.#..#.####..
#....#..##..##..#......#...##..
#.#..#...#..#...###..#.#.##....
##....#....##.####...#.#.###...
##.#...#.......#.##.##....#...#
..#.#..........#..#.#.#....#..#
..#........#...#....#....#....#
..#..#....#.......#........#...
......#....#....##.#....#.#.##.
.##...###.##.##....##.#...###..
.....##..#.#.....###..#.....###
....##.#.##...##..##........#..
#...#..##.#.#....#......#...#..
.###.##.#........#.####....#...
#.##.....#..#...#..##.##..#.#..
.....#.#..#....#..#...##.##.#..
.#......#####...##...#.#.###.#.
#......##....#.....#......##.#.
#.#.##.###.#......#####..#.....
........###.#...#..#.#........#
....#....###..#.##.#...#....#..
..........#..#.#....#...#.#...#
#.##......###.#.#.#....####...#
...#.#....#........##.#.....##.
.....##..#.#.#..###...##...#...
#...#...#....#....##........#..
.....#.........##.#......#..#..
#.....##..#.###.....#....##.##.
...#..#..#.#........##...##.#.#
..#..##.###.....#.#.....###.##.
..###.........#...##.....###...
...###...##.#...##....##.......
.#.#..#...###..#.#....#....#...
##..#.......#....#.#...#..#..#.
..#......#....####..##..#.###.#
..#.......##........#.#.#..#...
.#.......#.##.#.##.#.......#..#
###...#...#...#...#..#...#...##
..#..........#..###........##..
.##..#....##......##........#.#
......#.##......#......##...#.#
.#.#....#.#.#........#......#..
.#.#..#....####..##...##....#..
.#...##..#..#..#####....##.#...
.##.#.#...#...#.#...#.##.#...#.
###.#...##..#.###.#.....#.##.#.
#.....#.###.#.#...#..#....#.#..
..##..#....#......#.........###
.#...#...##......#...#.####....
..#.##...##..............#.#..#
..#......#..##...........#..#.#
..#####...#..#.......##...###..
..##..#....#.#...###.#...#.....
..#....#..#.#.......#..#.#.#...
.##..#.#.....##....#.......#...
...#.#..###...##....#....##..#.
.....##..#...##.####....##...#.
.......#.........#...#.##..####
........###..#..#.##.###..#....
.#.#..#.##.##.........#...#....
.###......#.....#....##....####
.##..##...........#.....#.....#
#.#.#.#.#.#.##..#.#.#..#.##....
.##.##...##..#....##..###..####
#..##.#......#...###.........#.
#..#...#..##..#..##.....##...#.
#...##..#...##.#.###.#...#.....
.###.....#.......#...#.##.###.#
..........#.#......#...###...##
..##....#.#..#....#.###...#..##
#.#..#....##.##..##.........##.
#.....#.##.###.#..#....##...#..
...#........##...#..###..######
#..#.#.#.#...#....#....###.#..#
...##.##.##.....##..#........#.
..#.#....#..#.......#...##.##.#
###.##.......##..#.####...#.#..
....#.#.....##.....#.#.##...#..
.#..##..#.....#.#..#...#..#..#.
.###....##...#......#.....#....
##.##.###......#...#...###.#...
#...#.##...#.#....##.....####..
#.#.#.#...###...##.............
..#....#.....##.#...#......#...
....#...#......#...#..#...#.#..
.###..#.....#..#...#....#...#..
..#...#.#..###.......#..#.#...#
#...###.##.....#....#..#.#..##.
...#.##.#.##......#.#.#.##.....
........####...##...##..#....#.
.#.#....#....#.#...##.###...##.
#.#...###.#.#.#....#....#.#....
.####.#..#.#....#..#.#..#..#...
#####...#...#...#....#.#.#..##.
..###.##.###...#..........#.##.
##.....#...#....###..###.#.#.#.
#..##.#..#..#..#...#.#...###..#
##....#.#...##......#.#...#...#
.##..........#.#....#...#.##..#
....#....####.#.####......#.###
..##.#.....####.#.####.......#.
#.##.##.#.....#.##......##...#.
......###..#.....#.....##......
..#..#....#.#...#.....#......##
##..#..#..###.#.#..#..##.....#.
#....#.#.....#####...#.#.......
.....#..#....#.#.#..#...#...#..
............#.##......##.##.#.#
....#...#.#........###....#....
..#..#...###.#....##..#..###.##
#.##....#...#.....##..#.##.#...
...##..###...#.#.....##.#......
.#..#.##.#####..#.......#..###.
...#.##...###.....####.#.#.....
.#......##.#.#.#.#.##.#.....#..
..#.##.#..##.......#.#.....##..
..................#....#...#...
.##.#..#.#.#..#.......#.#..##.#
....#........#......#..####..#.
#...#...##..##.#..#.......##...
#..#..#..#..#........####..#.#.
..#.#......#..#.##.##.#.#...#.#
...#..#......#.#.###.#.##..##..
..#...##.....#..#...##..#.#....
#.........#....#..#....##.##..#
..#..#.#....#..#...#.##.....#..
...#..#...#.#.....#..##..#.#...
....#........#.#....##..##..#..
#.....#.#..#.......#.##.....#..
.#.###.###...##...##..###...#..
.##.##.......#.#......#.....#.#
...#....##....#..#..........#.#
..#.##.........#.#.....#.....#.
...#.##..##.......##..##...#...
#.##......##.##..#.....##...##.
#.#.#..##...#.#............#.#.
....#.....#......##...#.#.....#
...#.#......#.#...###.......#..
......##..###....#.#...#.#####.
..#..#.#.#...##.#...###..##..#.
##.##.#.#.##.#..#....#...#...#.
#..#....######.##.#...#...#.#..
.#..#.##....#..#.#.......#....#
....#....#......##....##.#.#...
.###......#..#..#.......####..#
.#.#.....#..###...........##...
.##..##.##.....####..#..#..##..
..#..##.#......#...###.##..#.#.
....##..#.....###..#.##....##.#
#..#......#....#.........#.....
.#...#.....#.#..#..##....#.....
.##..#...#..##.#..#...........#
..#..##........##.......#..#...
#.....#....#....#.#.#...##.#...
...#...#.#.#..#.##.#.#...#.....
..#..#.#....#....###....#.#.#..
...###..#...#..#....#.....#....
...#...#.#..#.....#...###......
##......#..........#.#.#..#.#.#
....#.....#.....#..#..#.#.#.#..
...####...#.##...#.#..#....#.#.
#.##........##.............#.##
#.#.#.#.#.....................#
.#.###....#..##.##.##....#.....
#.#...#.####.###...#..#..#.#...
.##...#..###.......##..#.#.....
#.#.#.#...#...#.##.....#.......
.##.#.#.#......####..#.#.......
###..........#......#...##...#.
.........##...#.##...#.#.......
...#.#.....#...#..#...#..##..#.
.#..###...#.#.........###....#.
##..#...#........#.........##..
.....#...#.#...#.#.#...........
..#....##...#.#..#..#.##....##.
.##....#.#.....##.#..#..#...##.
..##......#.#...#.#.......##.#.
##...#..#...##.#........#.##...
#......#.##..#.#..#.###.......#
#.#...#.....#.#......#.#.#.....
#.....#..#.......#....##.#.#..#
###.#....#..##.#.##....#....#..
#.##.##....#.#..#.#...#....#...
####...#####...#.....#....##.#.
....#.#...#.#.##.#.#.##.#.#.###
#.....##.#.....#.#......#.#..#.
.#....##.#..#........#...##.#..
......#...#....##....##....##..
..###.....#....##.#...#..#.....
#....##...##.##..##.#...#...#..
#..#...#...#.#....##..#.#....#.
......................#.....#..
.#..#...#.........#....##...###
.##.#.#...##...#...#...#..#....
.#.###....#.#............##..#.
###..##.#.#.#.#....##...#......
.##................####...##.##
.#.#.........##....#.#.##.##.#.
....#...#...#...##...#.##.#..#.
.#.#........#..##.....#..#....#
##.#..#.#....#.....#...#...#...
.#...##....#.....##....#..#.#.#
####.....#..#....#......###.##.
##..##.#....###.....#...#......
.##.#...#.....#.#..#.#..#.#...#
.....#.#..#..#..###.#...###.#..
.#.#.##.#..#.#..#...#..#.......
..#.....#....#.##.##.##.......#
.#..##....###...#..............
#....#...#.#.......#....##.#...
....#.#..##.#....#..#.#....#.#.
#.........##...#.#.............
#.#.......##.....#...##..##.#.#
.......#....#..##...#..#######.
.#.#...##........#..#.....#.#..
.#.......#..#......#.##.##...##
.........#............#....#..#
.#......#...##...##...#....###.
.........#...#.#.###.......#...
###.#..#.#.#.#......##...#.#...
.#.........##.#....###....#.#..
#.#....#..#.##.#..#....##...#..
###.#...#..#..##........#.###..
.....#....#..#........#..#.##.#
..#.....##......#....#..#.#.#..
.#.........#.....#.....#.......
......#....#.###..#.#....#....#
..#.#..#.#.###.........#..#..#.
..#..#.#.#.........#....##.#.#.
#.......#........##...##....#..
##..#..#...###...#..##..#..#.#.
##..#..#....#.#..##..#..#.#..#.
..##.....##.#..#.#........###..
..#.#..#..##........#...#....##
.##..#....##..#..#..#..#.#....#
#....#.....##........#.....#.##
......#....#.#..#......#.##....
.......#..#..#......##.........
......#...#..##.....#......#..#
#..#..#....##....#........##..#
##....#...#.#.....#####........
...#.#..#.#.##.#.##..##...#....
..#..#..#..#..#....#..#..##...#
.#.....#....##.##....##.....#..
#...#.....#.....#.#...#.#....#.
.###...#..##....#..#...#.###...
....#..##..#.......#.##.##..###
#.......##.....#.......#.#...##
#.....#.#.#....#.#......#.#.#..
..##.....#..###......##........
.....#...#..##....#......#.....
#..#..#....#.#...#..###.......#
.....#.....#....#..#...#.#..##.
#####........#...#..#..##..#.#.
.#..#...#.##....#.#..#......###
#.###.#..#.....##..##....#...#.
.#...#.#####....##..........##."""

I then split() the string into a list, using the newline character as the delimiter. I named the resulting list map_basis, since it’s the basis for a complete map of the hill:

map_basis = raw_input.split("\n")

I did a quick len(map_basis) check to see how long a list I was dealing with. It had 323 items.
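
In the REPL, that quick check looks something like this:

>>> len(map_basis)
323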

Strategy

Looking at the puzzle, it became clear to me that the most important thing was to be able to answer this question:

Given a set of coordinates, is there a tree at that location?

First, let’s consider the coordinate system of the problem. It’s not unlike screen coordinates: the origin, (0, 0), is at the upper left corner, x increases as you go right, and y increases as you go down.

All the strings in map_basis are 31 characters wide, and the actual map repeats itself in the x-direction starting at character index 31. This means that for any given line:

  • The character at index 31 is the same as the character at index 0.
  • The character at index 32 is the same as the character at index 1.
  • The character at index 33 is the same as the character at index 2.
  • The character at index 34 is the same as the character at index 3.
  • And so on…

This means that for any x-coordinate on the actual hill (let’s call it x_hill_coordinate), its corresponding x-coordinate on the map (let’s call it x_map_coordinate) can be defined by:

x_map_coordinate = x_hill_coordinate mod 31

The map doesn’t repeat itself in the y-direction. Any y-coordinate on the actual hill has the same corresponding y-coordinate on the map.

With that in mind, I defined this function:

def is_tree_at_coordinates(hill_x, hill_y):
    map_x = hill_x % 31
    return map_basis[hill_y][map_x] == "#"

This function should return True if and only if there is a tree at the coordinates (hill_x, hill_y).

Going down the hill

Now that I had a function that could tell me where the trees were, it was time to go down the hill! I wrote this function, which takes arguments for rightward and downward movement for each “step”. It then “travels” down the hill, counting trees along the way:

def tree_count_for_slope(right_increment, down_increment):
    right_coordinate = 0
    down_coordinate = 0
    tree_count = 0

    while down_coordinate < len(map_basis):
        if is_tree_at_coordinates(right_coordinate, down_coordinate):
            tree_count += 1
        right_coordinate += right_increment
        down_coordinate += down_increment

    return tree_count

For my input data, the tree count was 230.
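
The part-one answer comes from a single call, which I didn’t show above; it would look something like this:

print(tree_count_for_slope(3, 1))  # prints 230 for my puzzle input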

The Day 3 challenge, part two

The challenge

Here’s the text of part two:

Time to check the rest of the slopes – you need to minimize the probability of a sudden arboreal stop, after all.

Determine the number of trees you would encounter if, for each of the following slopes, you start at the top-left corner and traverse the map all the way to the bottom:

  • Right 1, down 1.
  • Right 3, down 1. (This is the slope you already checked.)
  • Right 5, down 1.
  • Right 7, down 1.
  • Right 1, down 2.

In the above example, these slopes would find 2, 7, 3, 4, and 2 tree(s) respectively; multiplied together, these produce the answer 336.

What do you get if you multiply together the number of trees encountered on each of the listed slopes?

In completing part one, I had also completed the crucial piece of part two! Let this be a lesson to Advent of Code participants: Creating a good data structure or interface for the input data will make coming up with the answers that much easier.

The solution was simply to plug the values above into my tree_count_for_slope() function and multiply the results together:

print(
    tree_count_for_slope(1, 1) * 
    tree_count_for_slope(3, 1) * 
    tree_count_for_slope(5, 1) * 
    tree_count_for_slope(7, 1) * 
    tree_count_for_slope(1, 2)
)

This gave me the solution for my data: 9533698720.

And with that, I had completed Day 3!
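
As a side note, the multiplication scales a little more gracefully if you keep the slopes in a list and use math.prod (available since Python 3.8). This is a variation for illustration, not the code I used:

import math

slopes = [(1, 1), (3, 1), (5, 1), (7, 1), (1, 2)]
print(math.prod(tree_count_for_slope(right, down) for right, down in slopes))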

If you have any questions, feel free to post them in the comments.

Solutions for other days in Advent of Code 2020



My solution to Advent of Code 2020’s Day 2 challenge, in Python

Welcome to another installment in my Advent of Code 2020 series, where I present my solutions to this year’s Advent of Code challenges!

For those of you not familiar with Advent of Code, here’s a quick description, taken straight from their “About” page…

Advent of Code is an Advent calendar of small programming puzzles for a variety of skill sets and skill levels that can be solved in any programming language you like. People use them as a speed contest, interview prep, company training, university coursework, practice problems, or to challenge each other.

You don’t need a computer science background to participate – just a little programming knowledge and some problem solving skills will get you pretty far. Nor do you need a fancy computer; every problem has a solution that completes in at most 15 seconds on ten-year-old hardware.

Advent of Code has been around since 2015, and each year, its puzzles have all been centered around a “Save Christmas” theme. Over the past five holiday seasons, programmers all over the world have solved problems with code in order to save Christmas.

This year’s puzzles are about saving the well-earned vacation you’re taking, after having saved Christmas five years in a row. Day 1’s puzzles had you fix an expense report that you had to deal with before you could go on vacation, and I posted my Python solution yesterday.

Spoiler alert!

Please be warned: If you want to try solving the challenge on your own and without any help, stop reading now! The remainder of this post will be all about my solution to both parts of the Day 2 challenge.

The Day 2 challenge, part one

Meme: When you remember the password to your account on the first try, featuring Fat Thor holding his hammer and saying “I’m still worthy!”

Here’s the text from part one of the challenge:

Day 2: Password Philosophy

Your flight departs in a few days from the coastal airport; the easiest way down to the coast from here is via toboggan.

The shopkeeper at the North Pole Toboggan Rental Shop is having a bad day. “Something’s wrong with our computers; we can’t log in!” You ask if you can take a look.

Their password database seems to be a little corrupted: some of the passwords wouldn’t have been allowed by the Official Toboggan Corporate Policy that was in effect when they were chosen.

To try to debug the problem, they have created a list (your puzzle input) of passwords (according to the corrupted database) and the corporate policy when that password was set.

For example, suppose you have the following list:

1-3 a: abcde
1-3 b: cdefg
2-9 c: ccccccccc

Each line gives the password policy and then the password. The password policy indicates the lowest and highest number of times a given letter must appear for the password to be valid. For example, 1-3 a means that the password must contain a at least 1 time and at most 3 times.

In the above example, 2 passwords are valid. The middle password, cdefg, is not; it contains no instances of b, but needs at least 1. The first and third passwords are valid: they contain one a or nine c, both within the limits of their respective policies.

How many passwords are valid according to their policies?

Importing the data

Every Advent of Code participant gets their own set of data. I copied my data and went through the usual process of bringing it into Python by pasting it into a triple-quoted string and assigning it to the variable raw_input.

I then split the string into an array, using the newline character as the delimiter. I named the array split_input.

Here’s the code, with the data abridged for brevity:

raw_input = """3-5 f: fgfff
6-20 n: qlzsnnnndwnlhwnxhvjn
6-7 j: jjjjjwrj
8-10 g: gggggggggg
5-6 t: ttttttft
6-11 h: khmchszhmzm
4-6 q: qqbjqqqj

...

3-6 h: hdhjhhhhchh
11-12 r: zrrkcrrrrrlh
7-9 v: vhqvlvwvzqwqvrxvjnf
1-5 r: rvmjr"""

split_input = raw_input.split("\n")

Formatting the data

One of the best things you can do while taking on an Advent of Code puzzle is to convert the data set you’re given into a format that will make it easier to solve the problem. The second part of the puzzle is usually based on the same data, and having it already in a helpful format will save you a lot of time.

With that in mind, my next step was to define a function that would convert each line in split_input into a dictionary. For example, when given the following line…

6-11 h: khmchszhmzm

…the function would produce the following dictionary:

{
    "min": 6,
    "max": 11,
    "char": "h",
    "password": "khmchszhmzm"
}

Here’s the function:

def convert_to_password_and_policy_dict(line):
    password_and_policy_dict = {}
    password_and_policy_list = line.split()
    
    min_max = password_and_policy_list[0].split("-")
    password_and_policy_dict["min"] = int(min_max[0])
    password_and_policy_dict["max"] = int(min_max[1])
    
    password_and_policy_dict["char"] = password_and_policy_list[1][0]
    
    password_and_policy_dict["password"] = password_and_policy_list[2]
    
    return password_and_policy_dict

I used convert_to_password_and_policy_dict() in a list comprehension to convert split_input into a list of “password and policy” dictionaries named passwords_and_policies:

passwords_and_policies = [convert_to_password_and_policy_dict(password_and_policy) for password_and_policy in split_input]

Here’s a peek at passwords_and_policies’ contents:

>>> passwords_and_policies
[{'min': 3, 'max': 5, 'char': 'f', 'password': 'fgfff'},
 {'min': 6, 'max': 20, 'char': 'n', 'password': 'qlzsnnnndwnlhwnxhvjn'},
 {'min': 6, 'max': 7, 'char': 'j', 'password': 'jjjjjwrj'},
 {'min': 8, 'max': 10, 'char': 'g', 'password': 'gggggggggg'},
 {'min': 5, 'max': 6, 'char': 't', 'password': 'ttttttft'},

...

 {'min': 6, 'max': 12, 'char': 'g', 'password': 'dmgggpgggwczggghggm'},
 {'min': 3, 'max': 6, 'char': 'h', 'password': 'hdhjhhhhchh'},
 {'min': 11, 'max': 12, 'char': 'r', 'password': 'zrrkcrrrrrlh'},
 {'min': 7, 'max': 9, 'char': 'v', 'password': 'vhqvlvwvzqwqvrxvjnf'},
 {'min': 1, 'max': 5, 'char': 'r', 'password': 'rvmjr'}]

I then wrote a function that takes a “password and policy” dictionary and returns True if the dictionary’s password meets its policy:

def meets_password_policy(password_and_policy):
    char_count_in_password = password_and_policy["password"].count(password_and_policy["char"])
    return password_and_policy["min"] <= char_count_in_password <= password_and_policy["max"]

I used that function as the criterion for a filter() to create a list of only the “password and policy” dictionaries whose passwords met their policies:

valid_passwords = list(filter(meets_password_policy, passwords_and_policies))

The solution to the first puzzle is the number of dictionaries in the resulting list, valid_passwords:

>>> len(valid_passwords)
564

I entered this result and completed the first challenge.

The Day 2 challenge, part two

Meme: Sorry, but your password must contain an uppercase letter, a number, a hieroglyph, a feather from a hawk, and the blood of a unicorn.

Here’s the text from part two of the challenge:

While it appears you validated the passwords correctly, they don’t seem to be what the Official Toboggan Corporate Authentication System is expecting.

The shopkeeper suddenly realizes that he just accidentally explained the password policy rules from his old job at the sled rental place down the street! The Official Toboggan Corporate Policy actually works a little differently.

Each policy actually describes two positions in the password, where 1 means the first character, 2 means the second character, and so on. (Be careful; Toboggan Corporate Policies have no concept of “index zero”!) Exactly one of these positions must contain the given letter. Other occurrences of the letter are irrelevant for the purposes of policy enforcement.

Given the same example list from above:

  • 1-3 a: abcde is valid: position 1 contains a and position 3 does not.
  • 1-3 b: cdefg is invalid: neither position 1 nor position 3 contains b.
  • 2-9 c: ccccccccc is invalid: both position 2 and position 9 contain c.

How many passwords are valid according to the new interpretation of the policies?

Since I already had the data in a nice, usable format, solving part two of the puzzle was easy. I simply wrote a new function that takes a “password and policy” dictionary and returns True if the dictionary’s password meets the policy described in part two:

def meets_new_password_policy(password_and_policy):
    first_position = password_and_policy["min"]
    char_in_first_position = password_and_policy["password"][first_position - 1] == password_and_policy["char"]
    
    second_position = password_and_policy["max"]
    char_in_second_position = password_and_policy["password"][second_position - 1] == password_and_policy["char"]

    return char_in_first_position ^ char_in_second_position

Note the return statement. Python has the and and or keywords for logical and and or, but it has no keyword for exclusive or. Instead, it uses the ^ operator, which is technically a bitwise exclusive or, but acts as a logical exclusive or when both operands are booleans.
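
Here’s a quick REPL check of how ^ behaves with boolean values:

>>> True ^ False
True
>>> True ^ True
False
>>> False ^ False
False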

I used that function as the criterion for a filter() to create a list of only the “password and policy” dictionaries whose passwords met their policies, according to the new rules:

new_valid_passwords = list(filter(meets_new_password_policy, passwords_and_policies))

The solution to the second puzzle is the number of dictionaries in the resulting list, new_valid_passwords:

>>> len(new_valid_passwords)
325

Upon entering that result, the Day 2 challenge was complete!

Other days’ solutions:

Here are my solutions for other days in Advent of Code 2020:

Categories
Programming What I’m Up To

My solution to Advent of Code 2020’s Day 1 challenge, in Python

 

December has arrived, and so has the great programming exercise known as the Advent of Code!

Think of it as an Advent calendar, but instead of chocolates (or cheese, or wine), you’re presented with a new programming puzzle every day between the start of December and Christmas Day, in which you try to save Santa’s mission. You can use whatever programming language you want, and you don’t need to be an expert — as the site says, “just a little programming knowledge and some problem solving skills will get you pretty far.”

Advent of Code started in 2015, and has been taking place every year ever since. The 2020 edition began on Tuesday, December 1st at 12:00 midnight Eastern time (UTC-5).

Not only do I plan on participating in this year’s Advent of Code, but I might even use a couple of the challenges in the Python class I’m currently teaching on behalf of Computer Coach.

You have to sign in to play

In order to take on Advent of Code’s challenges, you have to sign in using an account from one of these popular “federated ID” services:

  • Github
  • Google
  • Twitter
  • Reddit

This is for a couple of reasons:

  • Signing in makes it easier for the site to keep track of your progress. Advent of Code is structured so that you must successfully complete challenge n before taking on challenge (n+1).
  • While everyone has to solve the same problem, each user gets their own (presumably) unique data set.

Once you’ve signed in, you can start on the first challenge…

Spoiler alert!

Please be warned: If you want to try solving the challenge on your own and without any help, stop reading now! The remainder of this post will be all about my solution to both parts of the Day 1 challenge.

The Day 1 challenge, part one

Here’s the text from part one of the challenge:

Day 1: Report Repair

After saving Christmas five years in a row, you’ve decided to take a vacation at a nice resort on a tropical island. Surely, Christmas will go on without you.

The tropical island has its own currency and is entirely cash-only. The gold coins used there have a little picture of a starfish; the locals just call them stars. None of the currency exchanges seem to have heard of them, but somehow, you’ll need to find fifty of these coins by the time you arrive so you can pay the deposit on your room.

To save your vacation, you need to get all fifty stars by December 25th.

Collect stars by solving puzzles. Two puzzles will be made available on each day in the Advent calendar; the second puzzle is unlocked when you complete the first. Each puzzle grants one star. Good luck!

Before you leave, the Elves in accounting just need you to fix your expense report (your puzzle input); apparently, something isn’t quite adding up.

Specifically, they need you to find the two entries that sum to 2020 and then multiply those two numbers together.

For example, suppose your expense report contained the following:

1721
979
366
299
675
1456

In this list, the two entries that sum to 2020 are 1721 and 299. Multiplying them together produces 1721 * 299 = 514579, so the correct answer is 514579.

Of course, your expense report is much larger. Find the two entries that sum to 2020; what do you get if you multiply them together?

Here are the expense numbers that were provided for my account:

1140
1736
1711
1803
1825
1268
1651
2007
1923
1661
1788
1876
2003
1752
1988
1955
1568
1478
1699
1717
1828
1636
1387
1870
1658
1572
1703
1185
1569
1515
1142
1407
1587
1608
1827
1546
1808
1937
1815
1957
1401
1763
1970
1960
1853
1987
1865
1567
1664
1961
1771
1846
1971
1416
1897
633
1708
1606
515
1397
1873
1374
1969
1918
1170
1660
1494
1764
2002
1938
1396
1926
1714
1659
1805
1593
1899
1850
1644
1877
1561
1895
1985
1353
395
1919
1522
1745
1721
901
1765
1939
2009
1949
1852
1792
1749
1675
1883
1240
1868
1615
1693
1720
1388
1325
1337
867
1751
1408
1715
1942
1706
1894
1260
1945
1700
1148
1373
351
1790
1861
1755
1155
1622
1743
1872
1979
1262
1789
1305
1311
1729
1929
823
1623
2005
1932
1814
1909
1728
1592
1712
1363
1338
1804
1402
1198
264
1117
1791
1419
1229
1924
1838
1785
1982
1683
1950
1199
1984
1830
1921
1980
1834
1341
1282
1989
1854
1395
1847
1900
1913
1777
1779
1333
1800
1966
1543
1882
1375
1811
1673
1679
889
1670
1879
1312
1741
1772
1663
1776
1642
1674
1472
1580
1264
1738
1999
1637

I decided to use a Jupyter notebook running a Python kernel to solve the problem.

Importing the data

My first step was to copy the numbers above, paste them into a triple-quoted string, and assign that string to the variable raw_input:

raw_input = """1140
1736
1711
1803
1825
1268
1651
2007
1923
1661
1788
1876
2003
1752
1988
1955
1568
1478
1699
1717
1828
1636
1387
1870
1658
1572
1703
1185
1569
1515
1142
1407
1587
1608
1827
1546
1808
1937
1815
1957
1401
1763
1970
1960
1853
1987
1865
1567
1664
1961
1771
1846
1971
1416
1897
633
1708
1606
515
1397
1873
1374
1969
1918
1170
1660
1494
1764
2002
1938
1396
1926
1714
1659
1805
1593
1899
1850
1644
1877
1561
1895
1985
1353
395
1919
1522
1745
1721
901
1765
1939
2009
1949
1852
1792
1749
1675
1883
1240
1868
1615
1693
1720
1388
1325
1337
867
1751
1408
1715
1942
1706
1894
1260
1945
1700
1148
1373
351
1790
1861
1755
1155
1622
1743
1872
1979
1262
1789
1305
1311
1729
1929
823
1623
2005
1932
1814
1909
1728
1592
1712
1363
1338
1804
1402
1198
264
1117
1791
1419
1229
1924
1838
1785
1982
1683
1950
1199
1984
1830
1921
1980
1834
1341
1282
1989
1854
1395
1847
1900
1913
1777
1779
1333
1800
1966
1543
1882
1375
1811
1673
1679
889
1670
1879
1312
1741
1772
1663
1776
1642
1674
1472
1580
1264
1738
1999
1637"""

Now that I had the data in a string, I could split the string into an array, using the newline character as the delimiter. I named the array split_input:

>>> split_input = raw_input.split("\n")
>>> split_input
['1140', '1736', '1711', '1803', '1825', '1268', '1651', '2007', '1923', '1661', '1788', '1876', '2003', '1752', '1988', '1955', '1568', '1478', '1699', '1717', '1828', '1636', '1387', '1870', '1658', '1572', '1703', '1185', '1569', '1515', '1142', '1407', '1587', '1608', '1827', '1546', '1808', '1937', '1815', '1957', '1401', '1763', '1970', '1960', '1853', '1987', '1865', '1567', '1664', '1961', '1771', '1846', '1971', '1416', '1897', '633', '1708', '1606', '515', '1397', '1873', '1374', '1969', '1918', '1170', '1660', '1494', '1764', '2002', '1938', '1396', '1926', '1714', '1659', '1805', '1593', '1899', '1850', '1644', '1877', '1561', '1895', '1985', '1353', '395', '1919', '1522', '1745', '1721', '901', '1765', '1939', '2009', '1949', '1852', '1792', '1749', '1675', '1883', '1240', '1868', '1615', '1693', '1720', '1388', '1325', '1337', '867', '1751', '1408', '1715', '1942', '1706', '1894', '1260', '1945', '1700', '1148', '1373', '351', '1790', '1861', '1755', '1155', '1622', '1743', '1872', '1979', '1262', '1789', '1305', '1311', '1729', '1929', '823', '1623', '2005', '1932', '1814', '1909', '1728', '1592', '1712', '1363', '1338', '1804', '1402', '1198', '264', '1117', '1791', '1419', '1229', '1924', '1838', '1785', '1982', '1683', '1950', '1199', '1984', '1830', '1921', '1980', '1834', '1341', '1282', '1989', '1854', '1395', '1847', '1900', '1913', '1777', '1779', '1333', '1800', '1966', '1543', '1882', '1375', '1811', '1673', '1679', '889', '1670', '1879', '1312', '1741', '1772', '1663', '1776', '1642', '1674', '1472', '1580', '1264', '1738', '1999', '1637']

split_input is an array of strings which needed to be converted into integer values.

In many other languages, I’d do this by using the map function to apply a “convert a string to its integer value” function to every item in the array, creating a resulting array called expenses. Here’s the Python version of that approach:

>>> expenses = list(map(int, split_input))
>>> expenses
[1140, 1736, 1711, 1803, 1825, 1268, 1651, 2007, 1923, 1661, 1788, 1876, 2003, 1752, 1988, 1955, 1568, 1478, 1699, 1717, 1828, 1636, 1387, 1870, 1658, 1572, 1703, 1185, 1569, 1515, 1142, 1407, 1587, 1608, 1827, 1546, 1808, 1937, 1815, 1957, 1401, 1763, 1970, 1960, 1853, 1987, 1865, 1567, 1664, 1961, 1771, 1846, 1971, 1416, 1897, 633, 1708, 1606, 515, 1397, 1873, 1374, 1969, 1918, 1170, 1660, 1494, 1764, 2002, 1938, 1396, 1926, 1714, 1659, 1805, 1593, 1899, 1850, 1644, 1877, 1561, 1895, 1985, 1353, 395, 1919, 1522, 1745, 1721, 901, 1765, 1939, 2009, 1949, 1852, 1792, 1749, 1675, 1883, 1240, 1868, 1615, 1693, 1720, 1388, 1325, 1337, 867, 1751, 1408, 1715, 1942, 1706, 1894, 1260, 1945, 1700, 1148, 1373, 351, 1790, 1861, 1755, 1155, 1622, 1743, 1872, 1979, 1262, 1789, 1305, 1311, 1729, 1929, 823, 1623, 2005, 1932, 1814, 1909, 1728, 1592, 1712, 1363, 1338, 1804, 1402, 1198, 264, 1117, 1791, 1419, 1229, 1924, 1838, 1785, 1982, 1683, 1950, 1199, 1984, 1830, 1921, 1980, 1834, 1341, 1282, 1989, 1854, 1395, 1847, 1900, 1913, 1777, 1779, 1333, 1800, 1966, 1543, 1882, 1375, 1811, 1673, 1679, 889, 1670, 1879, 1312, 1741, 1772, 1663, 1776, 1642, 1674, 1472, 1580, 1264, 1738, 1999, 1637]

It works, but from a Python programming point of view, it just doesn’t feel right.

The Pythonic approach would involve using a list comprehension instead of map() (which also means there’s no need to convert the resulting iterator into a list). It just seems more readable:

>>> expenses = [int(string) for string in split_input]
>>> expenses
[1140, 1736, 1711, 1803, 1825, 1268, 1651, 2007, 1923, 1661, 1788, 1876, 2003, 1752, 1988, 1955, 1568, 1478, 1699, 1717, 1828, 1636, 1387, 1870, 1658, 1572, 1703, 1185, 1569, 1515, 1142, 1407, 1587, 1608, 1827, 1546, 1808, 1937, 1815, 1957, 1401, 1763, 1970, 1960, 1853, 1987, 1865, 1567, 1664, 1961, 1771, 1846, 1971, 1416, 1897, 633, 1708, 1606, 515, 1397, 1873, 1374, 1969, 1918, 1170, 1660, 1494, 1764, 2002, 1938, 1396, 1926, 1714, 1659, 1805, 1593, 1899, 1850, 1644, 1877, 1561, 1895, 1985, 1353, 395, 1919, 1522, 1745, 1721, 901, 1765, 1939, 2009, 1949, 1852, 1792, 1749, 1675, 1883, 1240, 1868, 1615, 1693, 1720, 1388, 1325, 1337, 867, 1751, 1408, 1715, 1942, 1706, 1894, 1260, 1945, 1700, 1148, 1373, 351, 1790, 1861, 1755, 1155, 1622, 1743, 1872, 1979, 1262, 1789, 1305, 1311, 1729, 1929, 823, 1623, 2005, 1932, 1814, 1909, 1728, 1592, 1712, 1363, 1338, 1804, 1402, 1198, 264, 1117, 1791, 1419, 1229, 1924, 1838, 1785, 1982, 1683, 1950, 1199, 1984, 1830, 1921, 1980, 1834, 1341, 1282, 1989, 1854, 1395, 1847, 1900, 1913, 1777, 1779, 1333, 1800, 1966, 1543, 1882, 1375, 1811, 1673, 1679, 889, 1670, 1879, 1312, 1741, 1772, 1663, 1776, 1642, 1674, 1472, 1580, 1264, 1738, 1999, 1637]

Now that I had the expenses in a Python list (that’s Pythonese for “array”), I could work with them.

Combinations to the rescue!

Once again, the goal of the challenge was to find the two numbers in the expense report whose sum was 2020.

To solve this problem, we need a way to generate all the possible combinations of two numbers taken from the list. I could write this code, but Python’s itertools module has a combinations() function that can do just that.

Here’s a quick demo of combinations() in action. Given a list containing a small number of integers, it generates all the possible 2-number combinations you can get from the list, without reusing elements (that is, no element of the list is used more than once in any single combination):

>>> from itertools import *
>>> simple_list = [1, 3, 5, 7, 9]
>>> list(combinations(simple_list, 2))
[(1, 3), (1, 5), (1, 7), (1, 9), (3, 5), (3, 7), (3, 9), (5, 7), (5, 9), (7, 9)]

itertools also has a combinations_with_replacement() function. Rather than tell you what it does, let me show you:

>>> list(combinations_with_replacement(simple_list, 2))
[(1, 1), (1, 3), (1, 5), (1, 7), (1, 9), (3, 3), (3, 5), (3, 7), (3, 9), (5, 5), (5, 7), (5, 9), (7, 7), (7, 9), (9, 9)]

With that in mind, I used combinations() to generate a list of all the possible two-number combinations in expenses, which I assigned to a variable named all_expense_pairs:

>>> all_expense_pairs = list(combinations(expenses, 2))
>>> len(all_expense_pairs)
19900

Now that we have all the possible two-number combinations from the expense report, we can try to find the one(s) whose numbers add up to 2020.

Any time you’re in a situation where you need to find values in an array that match some criteria, you should think about applying a filter() function. I did just that: I used a filter() to extract a list of only those pairs that summed to 2020…

def sums_to_2020(values):
    return sum(values) == 2020

>>> result = list(filter(sums_to_2020, all_expense_pairs))
>>> result
[(1387, 633)]

The resulting list had one tuple, (1387, 633), whose values sum to 2020. I entered the product of these two numbers — 877971 — and completed the first challenge.

The Day 1 challenge, part two

Here’s the text from part two:

The Elves in accounting are thankful for your help; one of them even offers you a starfish coin they had left over from a past vacation. They offer you a second one if you can find three numbers in your expense report that meet the same criteria.

Using the above example again, the three entries that sum to 2020 are 979, 366, and 675. Multiplying them together produces the answer, 241861950.

In your expense report, what is the product of the three entries that sum to 2020?

Had I solved the problem from first principles, the solution might have taken a lot of extra work. Thanks to the use of itertools.combinations(), the solution for part two took three lines of code:

>>> all_expense_triplets = list(combinations(expenses, 3))
>>> result2 = list(filter(sums_to_2020, all_expense_triplets))
>>> result2
[(867, 264, 889)]

Once again, the resulting list had one tuple, (867, 264, 889), whose values add up to 2020. I entered the product of these three numbers — 203481432 — and completed the second challenge.

Feeling simultaneously proud and soiled

Thanks to Python (and remembering that it has a library that can do combinations and permutations), I set a personal best in solving the Day 1 puzzles. I’m pretty pleased, but at the same time, I did so little work that it feels as if I’ve cheated. I may have to try solving the problem from first principles if I have the time.
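
If I do get around to it, the from-first-principles version would probably amount to a brute-force nested loop over the expense list. Here’s a rough sketch of the idea (not the code I actually submitted), using the expenses list from earlier:

def find_pair_summing_to(target, numbers):
    # Check every pair of entries -- essentially what
    # itertools.combinations() was doing for me behind the scenes.
    for i in range(len(numbers)):
        for j in range(i + 1, len(numbers)):
            if numbers[i] + numbers[j] == target:
                return (numbers[i], numbers[j])
    return None

# find_pair_summing_to(2020, expenses) should return (1387, 633),
# the same pair that the combinations()/filter() approach found.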

Other days’ solutions:

Here are my solutions for other days in Advent of Code 2020:

Categories
What I’m Up To

My “Welcome to Auth0” swag

My Auth0 swag arrived today! This means I can finally partake in the techie tradition of making the “Look at the stuff I got when I joined the company!” post.

The laptop arrived on the Friday of my first week, and it’s a nice one:

When I got hired, incoming Auziros — that’s the internal term for “Auth0 employee” — got the choice of either a 13″ or 16″ MacBook Pro.

Many developers I know prefer to go with a smaller, lighter notebook. As a person who carries an accordion to social events, conferences, and bars (or at least, I used to, before the plague), I have a distorted sense of what “lightweight” is, and consider a 16″ laptop dainty. I don’t mind the extra weight, and I appreciate the extra processing power, screen real estate, USB ports, and battery size.

The swag arrived in a box at noon, and most of it is Auth0-branded and in the official colors.

The goodies are:

  • 2 Auth0 t-shirts
  • 2 Auth0 stickers
  • A reversible booklet, which reads “Auth0 Product Vision” on one side, and “Auth0 Brand Vision” on the other
  • An Auth0 water bottle
  • A metal cup, labeled “One giant leap”, and below it, a lunar footprint with the Auth0 “shield” logo in the middle
  • An Auth0 spiral-bound notebook
  • An Auth0 laptop zip-pouch
  • A Tile Mate bluetooth tracker

The laptop pouch can hold the 16″ MacBook Pro, and it certainly stands out. It’s a good thing that orange is one of my favorite colors:

I’m going to have to ask an Auziro who’s been around longer what the “One Giant Leap” promotion was all about. It is a nice mug:

Companies function better when their people can actually tell the story of the company and articulate what the company’s all about. And for the people who work at a company, knowing the company’s vision and the image it wants to project to the world can help give a sense of meaning and purpose to the work they do.

That’s why I think one of the best things in the box o’ swag was the double-sided booklet, with Auth0’s product vision on one side…

…and Auth0’s brand vision on the other:

It’s not unusual for a tech company to provide swag like branded bottles, bags, mouse pads, mugs, and stickers. In my more recent experience, I’ve been fortunate to get a really nice “welcome” package from Shopify, Smartrac, and Sourcetoad.

Some companies stand out by providing something a little more unusual with the welcome swag. Auth0 is one of those companies, as they didn’t just include a Tile Bluetooth tracker, but also put the box in a sleeve with nice messages. They could’ve just thrown it in with the rest of the stuff, but they took the trouble to make it a little more personal:

Thanks for the sweet stuff, Auth0!

Categories
How To What I’m Up To

How to downgrade to macOS Catalina after upgrading to Big Sur

I’ll admit it: I’ve gotten a little used to working at smaller companies, where there’s no monitoring of company computers, and it’s the Wild West as far as what you can install on them.

That’s no longer the case for me. I now work at Auth0, a company with a headcount that’s quickly approaching 800, with unicorn status and Series F funding, and it’s in the security industry. Naturally, there’s a full-fledged security team that monitors company-issued computers.

In my excitement to take the new version of macOS — Big Sur — out for a spin, I’d forgotten that the Security team hadn’t yet approved it for use. They very quickly (and I should add, nicely) contacted me and let me know that I needed to reinstall macOS Catalina as soon as possible.

There are other reasons why you might need to go back to Catalina after installing Big Sur, too.

For the benefit of anyone who needs to downgrade, here’s a step-by-step guide to reinstalling Catalina after you’ve installed Big Sur. You’ll need a USB key and the better part of an afternoon.

Step 1: The preliminaries

1a: Start downloading the Catalina installer from the App store

The first thing you’ll need is the macOS Catalina installer.

Here’s the link to the Catalina installer on the App Store.

It’ll take up around 9 gigabytes of space on your hard drive, and the App Store will put it in your Applications folder.

Once it’s completely downloaded from the App Store, the installer will start automatically. When this happens, close the installer. You’ll make use of it later.

The installer will take some time to download. Apple’s servers will be busier than usual, as many users are downloading Big Sur and other upgrades.

1b: Back up your files!

In the process of reinstalling Catalina, you’ll need to completely erase your Mac’s hard drive. If you have any files that you can’t live without, this is the time to back them up.

I didn’t have to worry about this, since:

  • All my work product is either code (which lives on GitHub) or content (which lives on GitHub or Google Docs), and
  • I’ve been at Auth0 less than a month, and between onboarding and offsites, there just hasn’t been that much of a chance for me to accumulate that many files on my hard drive!

1c: Get a nice fast USB key that stores at least 16 GB

The process will involve booting your Mac from a USB key containing the macOS Catalina installer, so you’ll need a key with enough space. An 8 GB USB key won’t be big enough. Because digital storage is all about powers of 2, the next size up will be 16 GB.

I strongly recommend that you use a USB 3 key, especially one with read speeds of 300 megabytes per second or better, such as the Samsung Fit Plus. Doing so will greatly speed up the process. Don’t use a USB key that you got as conference swag — it may have the space, but more often than not, such keys are slow, because they’re cheap.

If the USB key contains files that you want to keep, back them up. You’re going to erase the key in the next step.

Step 2: Make a bootable USB key containing the macOS Catalina installer

2a: Format the USB key

Plug the USB key into your Mac, then launch Disk Utility.

Select the USB key in Disk Utility’s left column, then click the Erase button:

Tap to view at full size.

You’ll be presented with this dialog box:

Enter MyVolume into the Name field, and for Format, select Mac OS Extended (Journaled). Click the Erase button. This will format the USB key with the volume name of MyVolume.

2b: Install the macOS Catalina installer onto the USB key

In Step 1a, you downloaded the macOS Catalina installer and closed it after it started automatically. In this step, you’ll transfer it to your freshly-formatted USB key.

Open a terminal window and paste the following command into it:

sudo /Applications/Install\ macOS\ Catalina.app/Contents/Resources/createinstallmedia --volume /Volumes/MyVolume

(The command above assumes that you gave the USB key the volume name MyVolume.)

Once you’ve provided sudo with your password, you’ll be asked if you want to erase the USB key. Entering Y in response will start the process of making the USB key a bootable drive and copying the macOS Catalina installer onto it:

Tap to view at full size.

The Erasing disk process will be relatively quick, but the Copying to disk process may take a while. This is where using a nice, fast USB 3 key will pay off.

Be patient and let it get to 100%, and wait for the Install media now available message to appear and the command line prompt to return.

2c: If your Mac is from 2018 or later, set it up to boot from external media

Check the year of your Mac’s manufacture by selecting About This Mac under the Apple menu:

  • If your Mac year is 2017 or earlier, you don’t need to follow the rest of this step. Proceed to Step 3.
  • If your Mac’s year is 2018 or later, you’ll need to change its security settings to allow it to boot from an external drive.

Here’s how you change the security settings:

  1. Restart your Mac and hold down the ⌘ (Command) and R keys when you see the Apple logo. This puts the computer into recovery mode, which provides many setup options.
  2. In the menu bar, select Utilities, and then select Startup Security Utility from the list that appears.
  3. The Startup Security Utility window will appear:
    1. Under the Secure Boot section, select Medium Security. This will allow you to install Catalina without having to connect to a network.
    2. Under the External Boot section, select Allow booting from external media. This will allow you to install Catalina from a USB key or disk drive.
Tap to view at full size.

Step 3: Install macOS Catalina

Restart your Mac, and hold down the Option key while it restarts. Your Mac will present you with a choice of startup disks.

Choose the USB key. Your Mac will boot up and you’ll be presented with the macOS Catalina installer screen:

Go ahead and install Catalina.

Once Catalina is installed, you can proceed with reinstalling your other software.

Once that’s complete:

  • If your Mac’s year is 2017 or earlier, you’re done installing Catalina. You can now go about reinstalling your software and restoring your backed-up files.
  • If your Mac’s year is 2018 or later, you’ll need to restore its original security settings. The process is described in Step 4, below.

Step 4: If your Mac is from 2018 or later, restore the original security settings

If your Mac is from 2018 or later, follow these steps to restore the original security settings once Catalina has been installed:

  1. Restart your Mac and hold down the ⌘ (Command) and R keys when you see the Apple logo. This puts the computer into recovery mode, which provides many setup options.
  2. In the menu bar, select Utilities, and then select Startup Security Utility from the list that appears.
  3. The Startup Security Utility window will appear:
    1. Under the Secure Boot section, select Full Security.
    2. Under the External Boot section, select Disallow booting from external media.
Tap to view at full size.
Categories
Career What I’m Up To

How I landed my job at Auth0

 

CRUSH THE FUNNEL

The opportunity

Icon: Calendar. “The beginning — August 18”

I first became aware of an opening for a “Senior R&D Content Engineer” at Auth0 on August 18th. You can see the job description here.

I did my research — because of course I did my research — and Auth0 turned out to be a very interesting opportunity for a number of reasons:

  • The position leans heavily on two skills that I have that aren’t seen in the same person that often: Programming and communications. I have lots of experience in these areas, and can bring my “A” game to the position.
  • Auth0 is in a business that is hot: Systems and information security, which is in demand as computing and networking becomes increasingly ubiquitous. The attractiveness of a hot business is obvious.
  • Auth0 is also in a business that is boring: To put it a little too simply, Auth0 is in the business of logins, which doesn’t sound terribly exciting. Here’s where things get counterintuitive — why would I want to get into a boring business? Partly because of an idea from entrepreneur and NYU marketing prof Scott Galloway, which is that boring businesses make money. It’s also an idea of mine, which is that “boring” businesses produce essential products and services. And in a world where identity and access control are crucial, an identity and access control service is essential. I’m all for this kind of boredom.
  • Auth0 is one of the standouts in a field with a few key players. There are the companies that specialize in identity and authorization, such as Okta and Ping Identity, and then there are the giants such as Microsoft, IBM, and Oracle. If the 2019 Gartner Magic Quadrant for Access Management is to be believed (and you should always read these graphs with some healthy skepticism), it’s at the top of the “Visionaries” quadrant:
Graph: Gartner “Magic Quadrant” for Access Management, 2019. The x-axis is “completeness of vision”, and the y-axis is “ability to execute”. The lower-left quadrant (“Niche players”) contains Optimal IdM, SecureAuth, and Atos (Evidian). The lower-right quadrant (“Visionaries”) contains Micro Focus, Broadcom (CA Technologies), OneLogin, Idaptive, ForgeRock, and Auth0, with Auth0 at the top. The upper-right quadrant (“Leaders”) contains Oracle, IBM, Ping Identity, Microsoft, and Okta.
The Gartner “Magic Quadrant” for Access Management, 2019. Tap to view at full size.

Everyone in the desirable top right quadrant, “Leaders”, is either an old guard fingers-in-every-tech-pie company (IBM, Microsoft, Oracle), or has been in the identity/access business for over a decade (Okta was founded in 2009; Ping Identity goes back to 2002 — when there were iPods, and they had click-wheels). Auth0 was founded in 2013, and of all the up-and-comers in its space, it’s at the top. That means room to grow, opportunities to apply my talents, and a chance to shine.

Crush the funnel

I combed my way through the most recent two years of the Auth0 blog and found two very useful articles:

These two articles gave me a lot of useful information about what it would take to land a job there: Namely, a focused effort, the willingness to run through a series of gauntlets, as pictured below…

Illustration: Dots showing the Auth0 hiring process in a progression from left to right: Screener, interviews, tech exercise, demo, CTO chat, Auth0 logo.

…and being ready to put in the energy to face their hiring funnel.

Here’s one depiction of the funnel, from their first “How We Hire Engineers” article:

Graph: First version of the candidate breakdown graph, showing 126 incoming candidates, 19 qualified applications (passed the screener), and 5 selected and hired.

Here’s a revised version, from a few months later:

Graph: Second version of the candidate breakdown graph, showing 159 applicants, 9 who made it to the qualifier screening call, 4 who made it to interviews, 2 who made it to the technical exercise, and 2 who were hired.
Tap the image to view it at full size.

The numbers above aren’t for the position I applied for, but for other Senior Engineer positions.

I really wanted this job. In order to beat these odds, my number one priority for the six weeks to come was to crush this funnel.

Step 0: Sending in an application

This is a software-as-a-service company, and in the time-honored tradition of zero-based indexing in software, the first step was Step 0! This involved filling out an application form and including the following “cover letter”, which was actually a large text area on the application form.

Applicants were encouraged to explain why they should be considered for the job. I first wrote it in a text editor, saved it for my records, and pasted it into the form. Here’s what it said:

I’m a technical evangelist, developer, and tech community builder, and I would love to help Auth0 make the internet safer as a Senior R&D Content Engineer!

I have a long history of helping both techies and laypeople make sense of technology in many ways: As a technology conference organizer, an author, a presenter, and in running technical meetup groups. I even had my own technology show for children, complete with puppet co-host.

Even though COVID-19 caused my last job to evaporate, I’ve managed to keep busy:

  • I’ve spent the past five weeks in the inaugural cohort of the “UC Baseline” cybersecurity program offered by Tampa Bay’s security guild, The Undercroft. All the instructors will attest to my ability to not just absorb new material, but to communicate, cooperate, and share knowledge with others.
  • I’ve also been teaching an introductory Python course on behalf of Computer Coach Training Center. There was local demand for this course, but they didn’t have any Python instructors. They contacted me, having seen my blog and recent presentations on game development in Python and Ren’Py.
  • Finally, I made revisions for the 2020 edition of the book iOS Apprentice, which teaches iOS app development by walking the reader through the process of writing four iPhone/iPad apps. I co-wrote the 2019 edition with Eli Ganim for RayWenderlich.com, and it spans 1500 pages.

In addition to this recent work, I’ve also done the following:

  • I’m the editor and author of Global Nerdy, a technology blog that I’ve written since 2006. It has nearly 4,000 articles and over 9 million pageviews. It’s also the home of the weekly Tampa Bay Tech, Entrepreneur, and Nerd Events mailing list, which I maintain.
  • I’m the author/developer/presenter for the video tutorial Beginning ARKit, which teaches augmented reality application development by writing four ARKit-based iPhone/iPad apps.
  • I was the top-rated presenter at the RWDevCon 2018 mobile developer tutorial conference, where I gave both a four-hour workshop and a two-hour presentation on augmented reality programming for iOS with ARKit.

I have years of experience in technical communications and instruction, having done the following:

  • Provided wide-ranging partner and developer training as a Developer Evangelist at Microsoft, from providing presentations to partners, to writing articles and editing the Canadian edition of MSDN Flash, to running hackathons, giving presentations, organizing conferences, and doing interviews with technology media. I was also Microsoft Canada’s most prolific blogger.
  • At GSG, I worked closely with their biggest partner, IBM, to help develop both the technical documentation and marketing messaging for their Network Infrastructure Cost Optimization offering, including writing, producing and narrating the promotional video.
  • Provided technical expertise to SMARTRAC’s partners as they used the Smart Cosmos platform and SMARTRAC RFID technology to keep track of goods and physical assets as they are manufactured, shipped, and sold.

I’m an active participant in the Tampa Bay tech scene. I’m part of the organizing teams behind BarCamp Tampa Bay and Ignite Tampa Bay (my 2015 Ignite talk was included in the “Best Of” list), my blog posts are included as a regular part of the Tampa Bay Tech newsfeed, and I was part of the Tampa Bay team that made it to the finals at Startup Bus 2019.

Whenever someone asks me for advice about identity or authenticating and authorizing users in their applications, my stock answer is “Go with Auth0. They’ve already figured out the hard stuff.” With my unusual skill set and experience, I could do that in a more in-depth way at Auth0 as Senior R&D Content Engineer.

Step 1: A phone conversation with Wendy from the People Team

Photo: “Selfie” featuring Joey deVilla in a sport jacket and dress shirt sitting at his desks in his home office, with MacBook Pro and three monitors in the background.
Yes, I dressed up for a PHONE interview. The interviewer didn’t know, but *I* did. Tap to see the original blog post from August 25th.

The application must’ve worked, because I made it to Step 1, the “recruiter screener” phase, where I talked to Wendy Galbreath from the People Team. As the Auth0 blog puts it, it wasn’t a tech interview, but “a high-level conversation about my experience — especially with remote work, interest in Auth0, the role and expectations.”

As I blogged that day:

All dressed up for a 📱 PHONE ☎️ interview. Sure, they won’t know I’m dressed up, but I’LL KNOW.

The interview itself took about a half hour, and I did about 90 minutes of prep beforehand, looking at the Auth0 site, checking recent news about the company, and reviewing Wendy’s LinkedIn profile.

She went into detail about the perks of working for Auth0, which further reinforced my desire to join, and I told her about my background and work experience, and why I thought I’d be a valuable addition to the team, using my best “radio voice” while doing so.

Step 2: Zoom interview with Tony, Head of Content

Photo: “Round Two!” — Another “Selfie” featuring Joey deVilla in a sport jacket and dress shirt sitting at his desks in his home office, with MacBook Pro and three monitors in the background.
The second interview was with Tony Poza, Auth0’s Head of Content. Tap to see the original blog post from August 28th.

I passed Step 1, which meant that three days later, I had a Zoom conversation with Tony Poza, Auth0’s Head of Content. This conversation was a little more technical: I talked about my experience developing software, overseeing the development of software, doing developer evangelism, and creating content.

This interview was just over an hour, and I did around 4 hours’ worth of prep and background reading, which included going through the Auth0 documentation and articles on their developer blog, and looking into the OAuth2 protocol, which Auth0 uses.

I enjoyed talking with Tony, and the interview only made me want to work at Auth0 even more.

Step 3: Zoom interview with Holly and Dan, two Senior Engineers

Photo: Yet another “Selfie” featuring Joey deVilla in a dress shirt sitting at his desks in his home office, with two MacBook Pros and two monitors in the background. Several items in the photo are highlighted: COVID-19 “zoom mullet”, “Read questions that I wanted to ask on this screen”, “Read notes I wrote about the company and its tech, developer site, and API on this screen”, “Funky shirt (sartorial savoir faire)”, “Podcasting microphone”, “The Star Trek screen (i.e. Talk to the interviewer on this computer)”, “Jupyter Notebook at the ready for impromptu coding demos”, “Read notes about my experience on this computer”.
When it comes to interviews, I *DO NOT* mess around. Tap to view at full size.

I passed that second interview, so it was time for another Zoom conversation, this time with Senior R&D Content Engineers Holly Lloyd and Dan Arias. If hired, I’d be working with them every day, so it was in their best interest to get a better feel for who I am, what I can do, and if working with me would be a good experience.

This interview was also a shade over an hour, and I’d done around 8 hours’ worth of prep, background reading, and some noodling with Auth0 and Python.

The conversation was a lot of fun, and I left it thinking Yes, I can definitely work with this team.

Step 4: Technical exercise — article + code

I’ll admit without any shame that by this point, I was checking my email very regularly for messages from Auth0.

I didn’t have to wait long. Hours after the Step 3 interview, I was notified that I had moved on to Step 4: the technical exercise!

I was now at this point of the funnel:

Graph: Auth0 hiring process graph with giant “YOU ARE HERE” marker pointing to the second-last step: Exercise.

This was a good place to be. With the major interviews done, passing was no longer subject to the vagaries of me having an off day or one of the interviewers being in a bad or at least unreceptive mood. This stage is all about proving that I could do the job and do so while working with my prospective teammates.

Most other engineering candidates at Auth0 are being hired to build, fix, or maintain the Auth0 service, so it makes sense that their exercise is to build some kind of technical project and then present it in a “demo call”, where they walk the interviewer through the project, explain their design decisions, and demonstrate the working solution.

As an R&D Content engineering candidate, my primary work output won’t be software, but content — documentation, instructions, articles, guides, and other material of that sort. My assignment was to write a “how to” article and build the accompanying project. The idea was to showcase things like:

  • Problem-solving and data sourcing technique
  • Resourcefulness
  • Writing and language proficiency
  • Attention to detail
  • Creativity

The assignment: Create a tutorial blog post explaining how to build and secure an API with Spring Boot, Kotlin, and Auth0.

My first thoughts:

  • Securing an API with Auth0. That makes sense.
  • Kotlin — nice! That’s definitely in my wheelhouse.
  • Spring Boot? I know what Spring is, and have made a career out of avoiding it. What the hell is Spring Boot?

Photo: “What the hell is a Hufflepuff?” meme, but with “Hufflepuff” crossed out and “Spring Boot” written in.

Since the exercise is partly a test of creativity, I was free to determine the kind of API that the reader of the tutorial would build. I thought I’d make it fun:

Photo: “A hot sauce API” — Photo of a tray full of hot sauce bottles, overlaid with the logos for Spring, Spring Boot, Kotlin, and Auth0.

It was an API for a catalog of hot sauces. For the benefit of the curious, here’s a summary:

  • GET api/hotsauces/test: Simply returns the text “Yup, it works!”
  • GET api/hotsauces: Returns the entire collection of hot sauces. Accepts these optional parameters:
      • brandNameFilter: Limits the results to only those sauces whose brandName contains the given string.
      • sauceNameFilter: Limits the results to only those sauces whose sauceName contains the given string.
      • descFilter: Limits the results to only those sauces whose description contains the given string.
      • minHeat: Limits the results to only those sauces whose heat rating is greater than or equal to the given number.
      • maxHeat: Limits the results to only those sauces whose heat rating is less than or equal to the given number.
  • GET api/hotsauces/{id}: Returns the hot sauce with the given id.
  • GET api/hotsauces/count: Returns the number of hot sauces.
  • POST api/hotsauce: Adds the hot sauce (provided in the request).
  • PUT api/hotsauces/{id}: Edits the hot sauce with the given id and saves the edited hot sauce.
  • DELETE api/hotsauces/{id}: Deletes the hot sauce with the given id.

The article I wrote first walked the reader through the process of building the API. Once the API was built, the article then showed the reader how to secure it so that the endpoints for CRUD operations required authentication, while the “is this thing on?” endpoint remained public.
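
To give a rough sense of how a client would exercise the finished API, here’s a quick Python sketch using the requests library. The base URL and access token below are placeholders rather than values from the actual exercise (the real token would come from an Auth0 tenant), so treat this as an illustration, not as code from the article itself:

import requests

# Placeholder values -- substitute your own server address and an access
# token issued by your Auth0 tenant.
BASE_URL = "http://localhost:8080/api/hotsauces"
ACCESS_TOKEN = "YOUR_AUTH0_ACCESS_TOKEN"

# The "is this thing on?" endpoint is public -- no token needed.
print(requests.get(f"{BASE_URL}/test").text)

# The CRUD endpoints require a bearer token.
response = requests.get(
    BASE_URL,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"minHeat": 5},  # one of the optional filters listed above
)
print(response.json())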

Icon: Slack icon.

I wasn’t alone during the exercise. They set up a Slack channel to keep me in touch with the team I was hoping to join, and it’s standard procedure to assign you a “go-to” person (Dan was mine). I maintained a good back-and-forth with them, keeping them apprised of my progress, asking questions, and once or twice even sharing photos of what I was making for dinner.

Illustration: Woodcut of an hourglass.

While they said I could take as long as I felt I needed to complete the project, I figured that I needed to keep a balance between:

  • giving myself enough time to handle all the unknowns and deliver a finely-honed article and accompanying project, and
  • not taking so long that I end up being disqualified. As Steve Jobs put it so succinctly: Real artists ship.

Photo: A van for Frontier, parked in a residential driveway.

On Day 2 of the project, while I was deep into working out how to use Spring Boot, a house down the street got connected to Frontier fiber internet. In the process, our house got disconnected. Luckily, I saw the truck down the street and straightened things out with the tech while he was still there.

I spent one Saturday working on the project with my computer tethered to my phone. Had I not caught the tech in time, the soonest I’d have been able to get someone to reconnect me would’ve been on Wednesday, a good four days later.

Photo: A computer screen showing “git push origin main”.

There came a point when I decided that the exercise was done and ready for evaluation. I made my final push to the repo and notified the team on Slack:

@channel I’d like to extend my most heartfelt thanks to everyone for this opportunity. It’s been fun, and I learned quite a bit in the process! As always, if there are any questions that you’d like me to answer, or anything else I can do for you, please let me know.

And then it was time to sit and wait. I checked Slack and my email a lot over those couple of days.

Step 5: BOSS FIGHT!
(Actually, an interview with Jarod, Director of Developer Relations)

I got an email three days later — a Friday afternoon — asking if I would be up for a last-minute Zoom interview with Jarod Reyes, Director of Developer Relations, who came to Auth0 in June from Twilio, where he was the Developer Evangelism Manager.

Naturally, I made myself available, and Step 5 took place late that afternoon, only a couple of hours after I got the email.

The webcam lights I’d ordered had arrived earlier that day, so I set them up quickly…

Photo: Joey’s MacBook Pro, with videochat lighting in the background.
Tap the photo to view it at full size.

…and I had just enough time to do a quick screen test for the interview. And yes, the accordion didn’t just happen to be there; it was strategically placed in the shot:

Photo: Joey deVilla in his home office, with his accordion in the background.
Actual screencap of my Zoom test prior to the interview.

The interview was friendly, brief, and half of it consisted of me asking Jarod questions about his plans for developer evangelism and content at Auth0.

With the call done, the weekend began. It’s been a while since I’ve impatiently waited for Monday to come around.

Step 6: The offer letter

Icon: Calendar — “The end: September 28”

Monday, September 28th: I checked my email a lot, and at 1:15 p.m., this message arrived:

Great news!

The team would like to extend an offer for you to join Auth0!  Please let me know your availability today for a call so that I can share the details with you.

T minus one week

It’s been two weeks since I got the offer letter. Since then, I’ve signed it, filled out the standard paperwork, and even received the dongle for my company-issued MacBook Pro:

Photo: Box for an Apple USB-C to Digital AV multiport adapter.

There’ve been some longer-than-usual shipping times for Apple products lately, but I’m not too bothered by that. I’m very pleased that I’m in and excited to be back in the developer relations / content game again.

What does this mean for the Tampa Bay tech scene?

Photo: Satellite photo of Florida, with the Auth0 logo over Tampa Bay.

For starters, it means that Auth0, a unicorn and player in the security space, has an increased Tampa Bay presence. (I’m not the only Auth0 employee, or “Auziro”, in the area.)

As part of the Developer Relations team, it’s my job to be part of the face that Auth0 presents to the developer community, and conversely, a way for the developer community to reach Auth0. I’m Tampa Bay’s “person on the inside”.

As a public-facing employee of a startup whose service overlaps with security, I expect that I’ll be participating in local startup and security events — first virtual ones, and eventually, once we’ve all managed to control the pandemic, real-life ones.

And finally, as a public-facing Auth0 representative, as well as the writer of this blog and the Tampa Bay tech, entrepreneur, and nerd events list, I hope to represent Tampa Bay as an excellent place for techies to live, work, and play in.

Keep an eye on this blog, as well as the Auth0 blog! There are many interesting developments coming, especially if your interests are in software, startups, or security.

Epilogue: Whatever became of that article?

Screenshot of the article on the Auth0 Developer Blog.

It was published on the Auth0 Developer Blog!