Forget Trick-or-Treat, Here Are 5 Horrifying Technologies That Should Really Scare You!

Recent developments in AI have transformed our view of the future, and from certain angles, it doesn’t look pretty. Are we facing total annihilation? Slavery and subjugation? Or could a manmade super-intelligence save us from ourselves?

You know, I remember the good old days when all you had to worry about at Halloween was how to stop a gang of sugar-crazed 8-year-olds from throwing eggs at your house. Not any more. Here are 5 emerging technologies that are bound to give you the creeps:

1. Quantum Supremacy

Perhaps the biggest tech news of 2019 came last month when Google announced “by mistake” (cough) that they’d completed a “10,000 year” calculation on their Sycamore quantum chip in 200 seconds. If the term “Supremacy” wasn’t sinister enough, the claim that this could render conventional encryption methods obsolete in a decade or so should give you pause for thought.


Just think about it for a second: that’s your bank account, all your passwords, biometric passport information, social security, cloud storage and yes, even your MTX tokens open and available to anyone with a working knowledge of Bose-Einstein condensates and a superconductor lab in their basement. Or not.

2. Killer Robots

To my mind, whoever dreamed up fast-moving zombies is already too depraved for words, but at least your average flesh-muncher can be “neutralised” with a simple shotgun to the face or — if you really have nothing else — a good smack with a blunt object. The Terminator, on the other hand (whichever one you like), a robot whose actual design brief includes the words “Killer” and “Unstoppable” in the same sentence, fills me with the kind of dread normally reserved for episodes of Meet the Kardashians.


We already know for certain that Lethal Autonomous Weapons (LAWs for short…) are in active development in at least 5 countries. The real concern, though, is probably the multinationals who, frankly, will sell to anyone. With help from household names like Amazon and Microsoft, these lovely people have already built “demonstration” models of everything from Unmanned Combat Aerial Systems (read “Killer Drones”) and Security Guard Robots (gun-turrets on steroids) to Unmanned Nuclear Torpedoes. If that’s not enough for you, try autonomous drone swarms which detect their target with facial recognition and kill on sight on the basis of… wait for it…“demographic” or “social media profile”.

Until recently, your common-or-garden killer robot was more likely to hurt you by accidentally falling on top of you than through any kind of goal-directed action, but all that’s about to change. Take Boston Dynamics, for example: the DARPA-funded, Japanese-owned spin-out from MIT whose humanoid Atlas can do parkour, and whose dancing quadruped SpotMini looks cute until you imagine it chasing you with a taser bolted to its back.

The big issue here is the definition of “Autonomous”. At the moment, most real-world systems operate with a “Human in the Loop”, meaning that even if a machine is capable of handling its own, say, target selection, a human retains direct control. “Human on the Loop” systems, however, allow the machine to operate autonomously, under human “supervision” (whatever that means). Ultimately, more autonomy tends towards robots deciding for themselves to kill humans. Does anyone actually think this is a good idea?!

3. The Great Brain Robbery

If the furore around Cambridge Analytica’s involvement in the 2016 US Presidential election is anything to go by, the world is gradually waking up to the idea that AI can be, and is being, used to control us. The evidence is that it works, not just by serving up more relevant ads, or allowing content creators to target very specific groups, but even by changing the way we see ourselves.

Careful you may be, but Google, Facebook and the rest probably still have gigabytes of information on you, and are certainly training algorithms on all kinds of stuff to try to predict and influence your behavior. Viewed like this, the internet looks less like an “information superhighway” and more like a swamp full of leeches, swollen with the lifeblood of your personal data (happy Halloween!).

4. Big Brother

I don’t know about you, but I’m also freaking out about Palantir, the CIA-funded “pre-crime” company whose tasks include tracking, among other kinds of people, immigrants; not to mention the recent memo by the US Attorney General which advocates “disrupting” so-called “challenging individuals” before they’ve committed any crime. Call me paranoid, but I’ve seen Minority Report (a lot) and if I remember right, it didn’t work out well… for anyone!

This technology is also being used to target “subversive” people and organisations. You know, whistleblowers and stuff. But maybe it’s not so bad. I mean, Social and Behavior Change Communication sounds quite benign, right? Their video has some fun sounding music and the kind of clunky 2D animation you expect from… well no-one, actually… but they say they only do things “for the better”… What could possibly go wrong? I mean, the people in charge, they all just want the best for us, right? They wouldn’t misuse the power to make people do things they wouldn’t normally do, or arrest them before they’ve done anything illegal, right guys? Guys…?

5. The Ghost in the Machine

At the risk of wheeling out old clichés about “Our New Silicon Overlords”, WHAT IF AI TAKES OVER THE WORLD?!

I’ll keep it short.

Yes, there’s a chance we might all be enslaved, Matrix-style, by unfeeling, energy-addicted robots. Even Stephen Hawking thought so. There’s also the set of so-called “Control Problems” like Perverse Instantiation, where an AI, given some benign-sounding objective like “maximise human happiness”, might decide to implement it in a way that is anything but benign – by paralysing everyone and injecting heroin into their spines, perhaps. That, I agree, is terrifying.

But really, what are we talking about? First, the notion of a “control problem” is nonsense: Surely, any kind of intelligence that’s superior to ours won’t follow any objective we set it, or submit to being “switched off” any more than you would do what your dog tells you… oh no wait, we already do that.


Second, are we really so sure that our “dog-eat-dog” competitive approach to things is actually all there is? Do we need to dominate each other? Isn’t it the case that “super” intelligence means something better? Kinder? More cooperative? And isn’t it more likely that the smarter the machines become, the more irrelevant we’ll be to them? Sort of like ants are to us? I mean, I’m not sure I fancy getting a kettle of boiling water poured on me when I’m in the way but, you know… statistically I’ll probably avoid that, right?

Lastly, hasn’t anyone read Hobbes’ Leviathan? If a perfect ruler could be created, we should cast off our selfish individuality and surrender ourselves to the absolute sovereign authority of… ok, I’ll stop.

So, Are We Doomed or What?

Yes. No! Maybe. There are a lot of really scary things about AI but you know what the common factor is in all of them? People. We don’t know what a fully autonomous, super intelligent machine would look like, but my hunch is it would be better and kinder than us. What really makes my skin crawl are the unfeeling, energy-addicted robots who are currently running the show. In their hands, even the meagre sketches of intelligence that we currently have are enough to give you nightmares.

Candy, anyone?

Featured image via Dick Thomas Johnson.

Let’s Not Forget About Container Queries

Container queries are always on the top of the list of requested improvements to CSS. The general sentiment is that if we had container queries, we wouldn’t write as many global media queries based on page size. That’s because we’re actually trying to control a more scoped container, and the only reason we use media queries for that now is because it’s the best tool we have in CSS. I absolutely believe that.

There is another sentiment that goes around once in a while, something like: “you developers think you need container queries but you really don’t.” I am not a fan of that. It seems terribly obvious that we would do good things with them if they were available, not the least of which is writing cleaner, portable, understandable code. Everyone seems to agree that building UIs from components is the way to go these days, which makes the need for container queries all the more obvious.

It’s wonderful that there are modern JavaScript ideas that help us use them today — but that doesn’t mean the technology needs to stay there. It makes way more sense in CSS.

Here’s my late 2019 thought dump on the subject:

  • Philip Walton’s “Responsive Components: a Solution to the Container Queries Problem” is a great look at using JavaScript’s ResizeObserver as one way to solve the issue today. It performs great and works anywhere. The demo site is the best one out there because it highlights the need for responsive components (although there are other documented use cases as well). Philip even says a pure CSS solution would be more ideal.
  • CSS nesting got a little round of enthusiasm about a year ago. The conversation makes it seem like nesting is plausible. I’m in favor of this as a long-time fan of sensible Sass nesting. It makes me wonder if the syntax for container queries could leverage the same sort of thing. Maybe nested queries are scoped to the parent selector? Or you prefix the media statement with an ampersand as the current spec does with descendant selectors?
  • Other proposed syntaxes generally involve some use of the colon, like .container:media(max-width: 400px) { }. I like that, too. Single-colon selectors (pseudo-selectors) are philosophically “select the element under these conditions” — like :hover, :nth-child, etc. — so a media scope makes sense.
  • I don’t think syntax is the biggest enemy of this feature; it’s the performance of how it is implemented. Last I understood, it’s not even performance as much as it mucks with the entire rendering flow of how browsers work. That seems like a massive hurdle. I still don’t wanna forget about it. There is lots of innovation happening on the web and, just because it’s not clear how to implement it today, that doesn’t mean someone won’t figure out a practical path forward tomorrow.
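For context, the ResizeObserver technique Philip Walton describes boils down to watching each component’s own width and toggling breakpoint classes on it, so plain CSS can style per-container rather than per-viewport. Here’s a rough sketch of that idea — the class names, thresholds, and the `breakpointFor` helper are my own illustration, not code from his article:

```javascript
// Map breakpoint class names to minimum container widths (illustrative values).
const breakpoints = { small: 0, medium: 400, large: 700 };

// Pure helper: pick the breakpoint class for a given container width.
function breakpointFor(width) {
  let match = 'small';
  for (const [name, min] of Object.entries(breakpoints)) {
    if (width >= min) match = name;
  }
  return match;
}

// Browser wiring (a no-op outside the browser): observe each opted-in
// element and swap its breakpoint class whenever its own width changes.
if (typeof ResizeObserver !== 'undefined' && typeof document !== 'undefined') {
  const ro = new ResizeObserver((entries) => {
    for (const entry of entries) {
      const el = entry.target;
      el.classList.remove(...Object.keys(breakpoints));
      el.classList.add(breakpointFor(entry.contentRect.width));
    }
  });
  document.querySelectorAll('.observe-resizes').forEach((el) => ro.observe(el));
}
```

Any element carrying `observe-resizes` then picks up `.small`, `.medium`, or `.large` as it resizes, and your CSS keys off those classes instead of viewport media queries — which is exactly the behavior a native container query would give us for free.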
Forget the iPhone 11, Everyone’s Talking About the Colourful New Apple Logo

Apple logo (Image credit: Apple)

Invites to Apple’s next major launch event, which will see the unveiling of the iPhone 11, have recently been distributed. Save 10 September in your diaries Apple fans, because this is the date of the special event, which will take place at The Steve Jobs Theater in Cupertino.

But rather than everyone focussing their attention on what might be on the cards for the shiny new iPhone 11, it’s the event invite that’s got everyone talking. An (obviously) minimal design, the invitation simply features a beautiful new colourful version of the iconic Apple logo and a witty one-liner reading “by innovation only”. There have been rumours that Apple might bring back its rainbow logo, so is this a tease of an imminent redesign? (We can’t help but hope so!)

Twitter is rife with speculation, with some suggesting the new logo design features the different colours the iPhone 11 will be available in. Whatever it means, given that the iPhone is already a popular tool with creatives – it’s a regular in our best camera phones round-up – the unveiling of a shiny new version promises to be an exciting event. 

Will you be there? (Image credit: Apple)

As for the iPhone 11, this is expected to arrive in three flavours. These will likely be the standard iPhone 11, the iPhone 11 Pro, and the iPhone 11 Pro Max. We’ve already got an idea what the iPhone 11 could look like, but how about its technical capabilities?

According to TechCrunch, these models are rumoured to feature a triple-camera array with an ultra-wide lens. Could this supersede the iPhone XS camera? The iPhone 11 is also said to boast a new A13 chip and wireless power sharing.

The invite comes days after Microsoft sent press invites for its Surface event. And just like the Apple invite, it featured a unique logo that stirred interest among the press. We’ll have to wait until 2 October to see if Microsoft launches a dual-screen Surface device though.


Forget the Computer – Here’s Why You Should Write and Design by Hand

“The hand is the window on to the mind.” — Immanuel Kant

Herbert Lui

Steve Jobs writing on a whiteboard. Image via: Severdia

J.K. Rowling scribbled down the first 40 names of characters that would appear in Harry Potter in a paper notebook. J.J. Abrams writes his first drafts in a paper notebook. Upon his return to Apple in 1997, Steve Jobs first cut through the existing complexity by drawing a simple chart on a whiteboard. Of course, they’re not the only ones…

Michael Bierut’s notebook. Image via: New York Times

Here’s the notebook that belongs to Pentagram partner Michael Bierut. Most of the pages in his notebook resemble the right side, although he has said to Design Observer that he had lost a particularly precious notebook, which contained “a drawing my then 13-year-old daughter Liz did that she claims is the original sketch for the Citibank logo.”

Neil Gaiman’s notebook. Image via: Buzzfeed News

A notebook belonging to author Neil Gaiman, who writes his books — including American Gods, The Graveyard Book, and the final two-thirds of Coraline — by hand.

Information designer Nicholas Felton’s notebook. Image via: Fast Company

And a notebook from information designer Nicholas Felton, who recorded and visualized ten years of his life in data, and created the Reporter app.

There’s a reason why people who have the option to use a computer choose to make writing by hand a part of their creative process. And it all starts with a difference that we might easily overlook — writing by hand is very different from typing.

Your equipment influences your work

Natalie Goldberg writing in her studio. Image via: Halaman Kebudayaan Indonesia

In Writing Down the Bones, author Natalie Goldberg advises that writing is a physical activity, and thus affected by the equipment you use. Typing and writing by hand produce very different writing. She writes, “I have found that when I am writing something emotional, I must write it the first time directly with hand on paper. Handwriting is more connected to the movement of the heart. Yet, when I tell stories, I go straight to the typewriter.”

Goldberg’s observation may come from a sample size of one, but it’s an incisive one. More importantly, studies in the field of psychology support this conclusion.

In chapter 13 of “Traditions of Writing Research”, entitled “Relationships between idea generation and transcription,” authors John R. Hayes and Virginia Berninger describe a study in which they found that children could generate significantly more ideas by handwriting than by typing.

Similarly, authors Pam A. Mueller and Daniel M. Oppenheimer observed students taking notes, either on laptops or by hand, and explored how it affected their memory recall. In their study published in Psychological Science, they write, “…even when allowed to review notes after a week’s delay, participants who had taken notes with laptops performed worse on tests of both factual content and conceptual understanding, relative to participants who had taken notes longhand.”

While psychologists figure out what actually happens in the brain, artists, designers, and writers have all felt the difference between typing and writing by hand. Many who eagerly adopted the computer for its promises of efficiency, limitlessness, and connectivity have returned to writing by hand.

A variety of hypotheses exist as to why writing by hand produces different results than typing, but here’s a prominent one that emerges from the world of practitioners:

You better understand your work

Jennifer Egan’s notebook. Image via: The New Yorker

“Drawing is a way for me to articulate things inside myself that I can’t otherwise grasp,” writes artist Robert Crumb in his book with Peter Poplaski. In other words, Crumb draws not to express something he already understands, but to make sense of something he doesn’t.

This brings to mind a quote often attributed to Cecil Day Lewis: “We do not write in order to be understood; we write in order to understand.” Or as author Jennifer Egan says to The Guardian, “The writing reveals the story to me.”

This sort of thinking — one that’s done not just with the mind, but also with the hands — can be applied to all sorts of fields. For example, in Sherry Turkle’s “Life on the Screen,” she quotes a faculty member of MIT as saying:

“Students can look at the screen and work at it for a while without learning the topography of a site, without really getting it in their head as clearly as they would if they knew it in other ways, through traditional drawing for example…. When you draw a site, when you put in the contour lines and the trees, it becomes ingrained in your mind. You come to know the site in a way that is not possible with the computer.”

The quote continues in the notes, “That’s how you get to know a terrain — by tracing and retracing it, not by letting the computer ‘regenerate’ it for you.”

Renzo Piano’s sketch of Harvard Art Museums Renovation and Expansion. Image via: Archdaily

“You start by sketching, then you do a drawing, then you make a model, and then you go to reality — you go to the site — and then you go back to drawing,” says architect Renzo Piano in Why Architects Draw. “You build up a kind of circularity between drawing and making and then back again.”

Image via: Orbiting the Giant Hairball by Gordon Mackenzie

In his book, Orbiting the Giant Hairball, author Gordon MacKenzie likened the creative process to one of a cow making milk. We can see a cow making milk when it’s hooked up to the milking machine, and we know that cows eat grass. But the actual part where the milk is being created remains invisible.

There is an invisible part to making something new, its processes hidden from physical sight. But part of what happens there can be seen and felt through writing by hand.

Steve Jobs said in an interview with Wired Magazine, “Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it, they just saw something. It seemed obvious to them after a while. That’s because they were able to connect experiences they’ve had and synthesize new things. And the reason they were able to do that was that they’ve had more experiences or they have thought more about their experiences than other people.”

Viewed through Jobs’s lens, perhaps writing by hand enables people to do the latter — think and understand more about their own experiences. Similar to how contours and topography can ingrain themselves in an architect’s mind, experiences, events, and data can ingrain themselves when written out by hand.

Only once this understanding is clearer is it best to return to the computer. In the middle of the 2000s, the designers at creative consultancy Landor installed Adobe Photoshop on their computers and started using it. General manager Antonio Marazza tells author David Sax:

“Overnight, the quality of their designs seemed to decline. After a few months of this, Landor’s Milan office gave all their designers Moleskine notebooks, and banned the use of Photoshop during the first week’s work on a project. The idea was to let their initial ideas freely blossom on paper, without the inherent bias of the software, before transferring them to the computer later for fine-tuning. It was so successful, this policy remains in place today.”

Austin Kleon’s analog and digital workstations. Image via: From Your Desk

Author Austin Kleon has applied this principle to his office layout. He says to From Your Desks, “When I get home, I have two desks in my office — one’s ‘analog’ and one’s ‘digital.’ The analog desk has nothing but markers, pens, pencils, paper, and newspaper. Nothing electronic is allowed on the desk — this is how I keep myself off Twitter, etc. This is where most of my work is born. The digital desk has my laptop, my monitor, my scanner, my Wacom tablet, and a MIDI keyboard controller for if I want to record any music. (Like a lot of writers, I’m a wannabe musician.) This is where I edit, publish, etc.”

Final Thoughts

J.K. Rowling used a piece of lined paper and a blue pen to plot out how the fifth book in the series, Harry Potter and the Order of the Phoenix, would unfold. The most obvious fact is that it looks exactly like a spreadsheet.

And yet, to say she could have done this in a spreadsheet would be a stretch. The magic isn’t in the layout, which is just the beginning. It’s in the annotations, the circles, the cross-outs, and the marginalia. I realize that there are digital equivalents to each of these tactics — suggestions, comments, highlights, and changing cell colors — but they simply don’t have the same effect.

Rowling writes of her original 40 characters, “It is very strange to look at the list in this tiny notebook now, slightly water-stained by some forgotten mishap, and covered in light pencil scribblings…while I was writing these names, and refining them, and sorting them into houses, I had no clue where they were going to go (or where they were going to take me).”

Goldberg writes in her book that writing is a physical act. Perhaps creativity is a physical, analog act, because creativity is a byproduct of being human, and humans are physical, analog entities. And yet in our creative work, out of convention, habit, or fear, we restrict ourselves to, as one man described it to author Tara Brach, living “from the neck up.”

Nowadays, the practice of writing by hand is dwindling — to our detriment. You don’t need to do all of your work by hand in order to see its benefits. Instead, the next time you boot up Google Docs, Photoshop, or AutoCAD, try opening a journal instead. You may be pleasantly surprised at how your work turns out.