Sunday, May 24, 2015

Mysterious Project Involves Circuit Spider

I'm working on a project. A circuit spider is involved.

Tuesday, May 19, 2015

Update re Live Electronic Music Performance Design

Last year, I wrote about an idea I had — a way to bring back some elements of classic raves and see those elements survive better. Over the years, rave music has done incredibly well, while rave events have almost been crushed by governments. The difference is, at a concert, the artists are special and the audience is there to see them, while at a classic rave, things were set up for the dancers first and the DJs second.

My design was a hybrid of a concert, a rave, a hippie drum circle, and a video game arcade. A performer plays drums in the center, while audience members can play along in little drum pods scattered in a circle around the performer. All the drums are electronic. They play sounds like a normal drum, but they also trigger video software. Thus the "audience" plays a part in creating the event.





Contrast this with the amazing setup the Glitch Mob uses to perform live:



I love this and hate it too. It's excellent work. It's beautiful. It's badass. It incorporates, updates, and adapts Japanese taiko drums while still remaining respectful of the original source material — a balance which artists often get wrong. But it's still 100% about the artists being special and the audience being there to see them, which to me seems much more like a rock concert than a club or a rave.

Also, at one point in the video, one of the Glitch Mob is shocked that a Hollywood set designer can draw, which is kind of ridiculous, because that's the job. And as a programmer, I find some other moments silly too: they wrote custom software, but all it seems to do is function as a bus for controller input. In the age of Overtone and Quil, that's kind of a letdown, especially given the stuff other artists are doing with custom software.

Anyway, back to my own design: in addition to these little 3D sketches, I also wrote a basic version of the software I had in mind. My drumming in this video is terrible, and so is the software, really, but it illustrates the basic idea. Hitting the drums triggers color changes in computer-generated visuals.



I've been outdone in this category as well. This is a promo video for the Critter & Guitari Rhythm Scope, an analog video synthesizer which responds to sound:



The interesting thing about this, to me, is that it's analog rather than digital.

Speaking of which, the performer in my design had a strobe light attached to their drum set. It's the little black box with a grey mesh:



I bought a strobe light and attempted to integrate it with my electronic drum kit using a protocol called DMX. Got absolutely nowhere, although I found some existing solutions in Ruby and Node (using both CoffeeScript and JavaScript).

But I've discovered that a small company in Italy makes a Eurorack solution for this, which links DMX to CV instead of MIDI. (Rolling your mouse over the bottom of the video brings up an audio control, although this rather stupidly assumes you're on a computer, not a phone or a tablet.)



More news as events warrant.

Monday, May 18, 2015

Underrated Synth: The Korg Wavestation

Never buy music gear without looking it up on YouTube and Vintage Synth Explorer first (or Gearslutz, or ModularGrid). And never pay for music gear without looking up the price history on eBay for the last three months. It takes five seconds and it'll save you a lot of money.

I recently found a Korg Wavestation SR on eBay for less than $200. Although they sometimes go for less, they usually go for more. I've had my eye out for a Wavestation for a long time, and seeing one in Richie Hawtin's live setup didn't hurt. But to be sure, I checked YouTube, where I found this demo:



It's painfully 90s, and it sounds as if it was made on the same machine that the entire X-Files theme song was made on, but that's because it was. It's also kind of awesome, in a painfully 90s way.

Wednesday, May 13, 2015

Strong Parameters Are A Weak Schema

Ruby on Rails went off the rails a long time ago.



I don't work with Rails today. But, like so many other developers, I kept working with Rails for many years after the Merb merge. Because I loved Ruby, and because the Rails developer experience remains a thing of beauty, even today.

I stuck around for Rails 4, and one of the changes it made was silly.
Rails has always had a nice way of sanitizing user input coming from ubiquitous forms. Up until Rails 3, the solution was to list accessible fields right in your models. Then Rails 4 came along and introduced a different solution, strong_parameters, which lets you take greater control over the sanitizing process.
As is often the case with Rails, the real problem here is that the core team failed to recognize a classic problem of computer science, because they underestimated the importance of API-centric web development and perceived the problem purely in terms of showing a web page to a user.

What Rails Was Thinking


Before I get into that, I just want to summarize the problem from the Rails perspective: you've got input coming in from users, who are filling out web forms. They might be up to mischief, and they might use your web form to cause trouble. So you have to secure your web forms.

The classic Rails solution for securing a web form: attr_accessible. Since models are the only way Rails puts anything into a database, you can recast "securing a web form" as validating an object. It makes perfect sense to say that code which secures an object's validity belongs in that object. So far, so good.

attr_accessible was a whitelisting mechanism which allowed you to specify which model attributes could be mass-assigned. Without it, the default way of updating or creating an object in Rails, update_attributes, would allow a user to update any attribute of a model, including (for example) their User.id or their authorization privileges.
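In a Rails 3 model, that looked something like this (the field names here are invented for illustration):

class User < ActiveRecord::Base
  # Only :name and :email can be mass-assigned.
  # Anything else in the incoming hash, like :admin or :id, gets filtered out.
  attr_accessible :name, :email
end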

But this whitelisting was disabled by default. You had to kick it into gear by calling attr_accessible at least once, in your model. People forgot to do this, including people at GitHub, a very high-profile company with great developers, which got very visibly hacked as a result. People responded by writing initializers:

ActiveRecord::Base.attr_accessible(nil) # whitelist nothing by default, so every model has to declare its own list

(Obviously, a better way to do that would be to wrap it in a method called enable_whitelist or something, but that's a moot issue now.)

People also responded by writing plugins, and in Rails 4, one of these plugins moved into Rails core.

So this is what changed:
  • attr_accessible had an inverse, attr_protected, which allowed you to use a blacklist instead of a whitelist. strong_parameters only permits a whitelist.
  • The whitelisting default changed from off to on.
  • The code moved from the model to the controller.
David Heinemeier Hansson wrote up the official rationale. I've added commas for clarity:
The whole point of the controller is to control the flow between user and application, including authentication, authorization, and, as part of that, access control. We should never have put mass-assignment protection into the model, and many people stopped doing so long ago ...

An Alternative Approach


Let's look at this from a different perspective now.

Say you're building a web app with Node.js, and you want to support an API as well as a web site. We can even imagine that your mobile app drives more of your user base, and more of your traffic, than your actual web site does. So you need to protect against malicious actors exploiting your web forms, as web apps always have had to. But you also need to protect against malicious actors exploiting your API traffic.

At this point, it's very easy to disagree with Mr. Hansson's claim that "we should never have put mass-assignment protection into the model." Both the "protect against malicious actors" problems here are very nearly identical. You might have different controllers for your API and your web site, and putting mass-assignment protection into those controllers could mean implementing the same code twice. Centralizing that code in the relevant models might make more sense.

Rails solves this by quasi-centralizing the strong_parameters call in a private method, typically at the bottom of the controller file. Here's the example from the official announcement:
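(Paraphrased from memory rather than copied verbatim, so treat the details as approximate.)

class PeopleController < ActionController::Base
  def update
    person = Person.find(params[:id])
    person.update_attributes!(person_params)
    redirect_to person
  end

  private

  # The permit list lives in a private method so create and update can share it.
  def person_params
    params.require(:person).permit(:name, :age)
  end
end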

But you could also just use JSON Schema. All your web traffic's probably using JSON anyway, all your code's in JavaScript already, and if you write up a schema, you can just stipulate that all incoming data matches a particular format before it gets anywhere near your application code. You can put all that code in one place, just as you could with models, but you move the process of filtering incoming input right up into the process of receiving input in the first place. So when you do receive invalid input, your process wastes fewer resources on it.

(This is kind of like what Rails did, except you can put it in the server, which in Rails terms would be more like putting it in a Rack middleware than in a controller.)
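Here's a rough sketch of that front-door filtering as Rack middleware, using the json-schema gem. The class name and wiring are mine, not anybody's official API:

require 'json'
require 'json-schema'

# Reject any POST whose JSON body doesn't match the schema,
# before a controller, a model, or the database ever sees it.
class SchemaGatekeeper
  def initialize(app, schema)
    @app = app
    @schema = schema
  end

  def call(env)
    if env['REQUEST_METHOD'] == 'POST'
      input = env['rack.input']
      body = JSON.parse(input.read) rescue nil
      input.rewind
      unless body && JSON::Validator.validate(@schema, body)
        return [400, { 'Content-Type' => 'application/json' }, ['{"error":"bad request"}']]
      end
    end
    @app.call(env)
  end
end

# config.ru: use SchemaGatekeeper, SIGNUP_SCHEMA  (a schema hash like the one further down)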

The funny thing is, writing a schema is basically what Rails developers do already, with strong_parameters. They just write their schemas in Ruby, instead of JSON.

Here's a less cluttered example:
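(A reconstruction rather than the literal code; the method name is mine.)

private

def signup_params
  params.require(:email).permit(:first_name, :last_name, :shoe_size)
end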

Note especially the very schema-like language in this line:

params.require(:email).permit(:first_name, :last_name, :shoe_size)

All you're doing here is permitting some attributes and requiring others. That's a schema. That's literally what a schema is. But, of course, it lacks some of the features that a standard like JSON Schema includes. For instance, specifying the type of an attribute, so mischievous Web gremlins can't fuck up your shit by telling you that the number of widgets they want to purchase is `drop table users`. (Rails has other protections in place for that, of course, but the point is that this is a feature any schema format should provide.)
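Here's a loose translation of that permit line into a schema with types, written as a plain Ruby hash (the constant name is mine):

SIGNUP_SCHEMA = {
  'type' => 'object',
  'required' => ['email'],
  'properties' => {
    'email'      => { 'type' => 'string' },
    'first_name' => { 'type' => 'string' },
    'last_name'  => { 'type' => 'string' },
    'shoe_size'  => { 'type' => 'integer' }
  },
  'additionalProperties' => false
}

Now shoe_size has to actually be a number, unexpected keys get rejected outright, and you can hand this same hash to the middleware sketch above, or to any microservice written in any language.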

Rails developers are writing half-assed schemas in Ruby. If/when they choose to farm out parts of their system to microservices written in other languages, they'll have to re-write these schemas in other languages. For instance, they might at that point choose to use a standard, like JSON Schema. But if you're building with the standard from the start, you only have to define that schema once, using one format.

In fact, Rails developers typically re-write their schemas whether they refactor to microservices or not. Many Rails developers prefer to handle API output using active_model_serializers, which gives you a completely different Ruby-based schema format for your JSON output.

Here's an example from the README:
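(From memory and paraphrased, with the URL wiring simplified.)

class CommentSerializer < ActiveModel::Serializer
  attributes :name, :body, :post_id, :url

  def url
    # hypermedia-style link; a real app would use the route helpers here
    "/posts/#{object.post_id}/comments/#{object.id}"
  end
end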

This code says "when you build the output JSON, serialize the name and body attributes, include post_id, and add some hypermedia-style URLs as well." It infers a lot from the database, and it's nicer to read than JSON Schema. But you can't infer a lot from a database without some tight coupling, and this syntax loses some of its appeal when you put it side-by-side with your other implicit Ruby schema format, and you have to remember random tiny distinctions between the two. It's kind of absurd to write the same schema two or three different times, especially when you consider that Ruby JSON parsing is so easy that your JSON Schema objects can be pure Ruby too if you want.

strong_parameters really only makes sense if you haven't noticed basic things about the Web, like the fact that HTTP has a type system built in.

Sunday, May 10, 2015

An Annoyance: "Virtuoso Mixer Players"

When a radio interviewer asked deadmau5 about his disrespect for DJing as an art form, deadmau5 said he didn't know any virtuoso mixer players.



But anybody who knows anything about dub music knows that dub is all about virtuoso mixer players.



Turntablism is also all about virtuoso mixer players.



When I saw Underworld perform live, one member of the band did nothing but play the mixer, the entire time.



If you don't know of any virtuoso mixer players, then you don't know about Underworld - one of the greatest electronic acts of all time - and you don't know about dub or turntablism - two entire genres which have both shaped electronic music.

deadmau5 is an interesting producer and his live shows have tremendous style. But he talks a lot of shit, and he doesn't always know what he's talking about.

Here's Orbital playing mixers, synths, iPads, and more.



And here's a virtuoso mixer player.

Thursday, May 7, 2015

My Skepticism Re: Uber Continues Unabated

Previously: What If Uber's Just A Terrible Business?

There's a City Paper article online where the author worked as an Uber driver and talks about how it works, especially the practical, can-I-make-a-living economics of it.

This web site is so awful that you apparently have to open up Developer Tools to read it, and execute this line of code in the console:

function contentRefresh(){} // redefine the site's refresh function as a no-op so the article stays readable

But the content's great. Spoiler alert: drivers can't make money. Uber's pricing strategy matches my fear that they're using venture capital to buy off entire cities by flooding them with amazing deals which seem like they can't last forever — because they can't — while building up the economic equivalent of a mountain of legacy code which collapses into an utter shitpile the day somebody buys the company. Only in this case, the acquirer's probably a whole bunch of people buying stock.

But what really kicked my skepticism into high gear was this excerpt:
TechCrunch broke the news that Uber was building a huge robotics lab in Pittsburgh, partnering with Carnegie Mellon University to "kickstart autonomous taxi fleet development," according to a company source.

Travis Kalanick, the CEO and founder of Uber, said at a conference last year that he'd replace human Uber drivers with a fleet of self-driving cars in a second...

You can get a glimpse of his vision in a fascinating paper from Columbia University, which did several case studies on what a future with driverless cars would look like — apparently, like Uber crossed with Minority Report. And this could be coming as soon as 2020, according to both Tesla and Google, both of which are also heavily invested in the race to be a player in this huge future market.

In this world, the paper projects, fewer and fewer people own private cars, because it doesn't make financial sense. Cars run on electricity, and most are much smaller, designed to carry only one or two people. The auto industry experiences a temporary boom, but then demand drops off a cliff. By around 2040, driverless cars are a majority on American roads. The number of cars drops by more than 90%, as do fuel consumption and emissions. Car accidents and traffic are nearly nonexistent.
So, a little context.

First, I'm a programmer, and like any programmer, I know most people's code sucks.



Second, in the late 1990s, the whole sales pitch for why young people should build the Web in the first place was that it would democratize media, eliminate the shallow hegemony of things like NBC and CNN, and replace it with in-depth, nuanced debate, because everything would be better once it revolved around the written word.

This is what we got instead.


Take the "which hormone-crazed moose are you?" quiz!

Maybe you don't remember the 1990s. Maybe you weren't there. Do you remember Heartbleed? Because that was last year. Do you remember the day the whole Internet discovered that none of our security had ever been working, and how relieved we all were when the entire rest of the world failed to notice, and civilization didn't collapse?

One mistake that stupid in one technology underpinning robot cars, and the entire world will notice.

I would have bought "by around 2040, driverless cars are a majority on American roads." It's entirely possible. But "car accidents and traffic are nearly nonexistent" makes the whole thing stink of bullshit so much that nothing can redeem those paragraphs now.

Say you want to replace trucks with robots. This is a worthy goal, and in fact I know of a cool startup headed in a similar direction. There are probably many. But keep in mind that long-haul cargo trucks are often operated by poorly-educated individuals driving under the influence of crystal meth. Your startup does not need to achieve a "nearly nonexistent" accident rate to succeed. It only needs to outperform poorly-educated individuals who are driving under the influence of crystal meth.

Also, cars are only sized for people because people make up the majority of vehicle cargo, and the entirety of vehicle operators. But if your vehicle's operator size is measured in millimeters, because it's a piece of silicon, then you suddenly only need to consider the size of your cargo. So, we're supposed to believe that traffic will become less of a problem, in a situation where there now exist compelling financial incentives to build extremely large vehicles, extremely tiny vehicles, and vehicles of every size in between?


This is a car.


These are also cars.

And keep in mind you can now use automated vehicles to transport anything.



And that cars can now have legs.



So this is what we're supposed to believe:

We're going to add an incredible level of variety to the sizes, shapes, and purposes of vehicles on the road, and in the skies. We're going to pilot them with software, not people, because software's cheaper and faster.

But this new, incredible diversity in size and purpose will not affect traffic patterns negatively. The software will not have any bugs, and if the software ever somehow does have a bug, those bugs will never be catastrophic, even though the software could be piloting any of an incredible range of possible vehicles, at any of an incredible range of possible speeds, while carrying any of an incredible range of possible cargo types — including human beings, nuclear waste, or angry bees.

Are you fucking kidding me?


The angry bees thing really happened. Fourteen million bees were released in a truck crash; the driver was stung countless times.

Here's a more likely scenario: these fuckers are going to crash a drone full of angry bees into a robot semi transporting nuclear waste, and then the whole thing is going to spill onto a fleet of robot Ubers, smashing the cars, killing everyone inside, and then turning the corpses into goddamn bee zombies with nuclear bee powers.

DON'T SAY I DIDN'T WARN YOU.

It's entirely possible the authors of this white paper were fools, rather than liars, but what they're saying is certainly false, one way or the other. There do not exist sufficient financial incentives for a perfect accident rate. An accident rate that keeps lawsuits from eating up profits is the most reasonable thing to expect. The higher those profits are, the more accidents they can subsidize. And that's what we can expect from the people whose code works.

But most people's code doesn't work. Most people's code almost makes sense, but not quite. So most software in very widespread use is doing things which almost make sense, but not quite, at scale.

Anyway, this report wasn't sponsored by Uber. But here's the billionth thing associated with Uber which is making my bullshit detector scream bloody murder. That's all I'm saying.

By the way, I love robots. I went to RobotsConf and piloted a drone with Node.js, and I loved every second of it. But I'm pretty sure I also crashed it into somebody's head at least twice.

Wednesday, May 6, 2015

Venture Capital Makes Us All Stupid (But So Does Moore's Law)

Most of the stupidest shit in hacking is programmers reinventing wheels. The new wheels start out square, and eventually become round. Sometimes the process of becoming round takes a very long time. CSS is one obvious example, but you see it over and over — in GUIs, in distributed systems, in language design, and everywhere else.

Avoiding the mistakes of the past is a very old problem with a very old solution: listen to people who have been there before. But old people are pushed out of the tech industry, where "old" means "over 30." I think there are two main reasons for this.

First, Moore's Law pushes virtually everything to become a computer. To computerize any given thing becomes cheaper and cheaper with every passing moment, and most things become more useful with the change. But as every object becomes computerized, the demand for programmers grows and grows, and it really shows no sign of stopping for quite a while.

It may slow or disappear altogether once computers learn to write code for themselves, but it may not. At the very minimum, the black market for illegal hacks will grow and grow and grow, irrespective of who wrote the systems being hacked, or indeed who wrote the hacks.

For the foreseeable future, because each generation of programmers is "always" larger than the one which came before it, it's going to be a truism that programming will "always" seem overrun with kids. But because people are very prone to stereotyping and overestimating the importance of successful flukes, a field overrun with young people is a great petri dish for cultivating ageism.

And venture capitalists exacerbate this problem. They're gamblers. They like to think they bet on likely winners, but it's more accurate to say they bet on what they perceive as likely winners. In other words, they go for young white men from Ivy League universities, not because those individuals truly have any higher probability of success, but because venture capitalists are human, and they're as susceptible to logical fallacies, stereotyping, and superstition as the rest of us.

But they're more influential than the rest of us. So, like their sexism, their racism, and their tendency to chase shallow fads, their ageism becomes the industry's ageism.

To be clear, they're not responsible for all of it. There are built-in factors which make ageism highly likely, if not necessarily inevitable.

But they make it worse. They hire young people straight out of college to reinvent wheels, badly, in huge numbers.

And keep in mind that Moore's Law is almost a force of nature, while venture capitalists are a group of people. Of the two forces that are making us stupid, one of them can be reasoned with (relatively speaking, at least).

I don't think these trends are very likely to dissipate, but it's worth trying to get some sense into their heads. And if you're a VC looking for an edge, I have good news: wisdom is inherently valuable, yet it has a terrible marketing problem, and it will probably keep having one for your entire lifetime.

Eurorack Is Awesome

One of the most important technological tasks in electronic music is intermachine communication. You might have a drum machine, a synthesizer to play a bass line, another synth to play a melody, and a sampler to play a loop from an old 1970s funk song. You might want all these devices to play at the same time. Or you might just want your computer to tell them which notes to play.

MIDI is the ubiquitous protocol which most music machinery uses for communication. It stands for Musical Instrument Digital Interface, and it's a simple protocol which can only transmit very limited data. It replaced CV, which stands for Control Voltage. CV is more expressive than MIDI, but its original implementations were very unreliable and inconvenient. MIDI offered reliability, regularity, perfect timing, and rock-solid stability.
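To make "very limited" concrete: an entire MIDI note-on message is three bytes, and every value is squeezed into seven bits.

# Note on, channel 1: status byte, note number (60 = middle C), velocity.
# Pitch and loudness are both integers from 0 to 127, and that's all the resolution you get.
note_on = [0x90, 60, 100]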

For years, a small group of hipsters and hackers have wanted to replace MIDI with OSC, which stands for Open Sound Control. The protocol carries more data than MIDI, uses a URL-like naming scheme, and has a number of other significant advantages, but it's never really taken off: it has failed to attract much acceptance, enthusiasm, or awareness. My gut feeling is that it was just too complicated.

Meanwhile, CV's experienced an incredible renaissance. Where previous decades saw a great many incompatible implementations, most CV-oriented gear today organizes around a common standard called Eurorack for voltages, machine sizes, and power consumption. Artists such as the Chemical Brothers, deadmau5, John Tejada, Orbital, Richard Devine, Nectarios, and Alessandro Cortini from Nine Inch Nails have embraced Eurorack, and Eurorack manufacturers have produced a ton of expressive, very powerful, and wildly innovative new instruments. Most Eurorack manufacturers are tiny companies — one of the best, Cwejman, is literally one person — but bigger names like Roland and Dave Smith Instruments have gotten involved in the past few months.

I'm not entirely sure what the lesson to learn here is. CV's renaissance stems from several factors:
  • Control voltage is an intensely simple API. (Control voltage is to synths what stdin and stdout are to Unix.)
  • Electronics are more reliable to manufacture today than they were when CV was first developed.
  • Vintage synth fanatics kept CV alive.
  • Advances in DSP and ever-tinier microprocessors make it easier than ever to build tiny, sophisticated instruments.
  • Software-emulated modular synthesis systems like Reason and Reaktor introduced a new generation to modular techniques.
  • Eurorack modules interact very simply and readily, while the Eurorack market is full of quirky experiments and ideas, sold in short runs. This combines intermittent rewards with scarcity, so it's a lot like if Lego blocks were sold the way Magic: The Gathering cards are. That's inherently addictive, and it's caused the market to grow.
Without diving into these causes in great detail, it's really hard to identify which are the most important driving forces. API simplicity looks like the best explanation, but it's also kind of a perfect storm.

Sunday, May 3, 2015

Two More Videos About Synthesizers

In this video, I show how a Microbrute sound works. I might use this for my upcoming class.



This is just a video of Mutable Instruments' Peaks and Frames modulating a bass line played by a Microbrute.

Three Gripes About Time Travel In Science Fiction

First, the grandfather paradox isn't real. All you do is get Buddhist with it. The moment is all that exists; time is just a way of tracking configuration permutations within the moment. Hopping out of one moment and into another, without travelling through all the intermediate moments, already suspends the allegedly tight coupling between time and chains of cause and effect. The grandfather paradox is for people who've never bothered to read quantum physics.

Second, while time may not truly exist in the classical sense, air certainly does. In normal life, when we travel from moment to moment, we can move into a new physical location by pushing air out of the way. If you suddenly appear somewhere because you've travelled through time, you suddenly share space with other matter. Although you might not, which brings me to the third issue.

If you're on a planet which is rotating about its own axis, revolving around the sun, and located within a solar system which is itself moving through space at 45,000 miles per hour, then any time travel system would have to involve a lot of travel through space as well, unless its only purpose was to cause people to die horribly in the most scientifically impressive way. Because ten seconds ago, the entire planet was somewhere else. So if you push somebody into another point in time without also changing their physical position, they'll probably just be stranded in the vacuum of space.

Like anything else, a practical time travel system would have to solve a lot of theoretically unrelated problems in order to be even slightly useful. And most of the "oh wow man that shit is deep" that goes on around time travel stories is just not.