Tuesday, February 05, 2019
How Alexa captured me as a customer and dragged me into the post-smartphone era
Here's what happened. During Prime Day, I snagged an Audible membership for about $5/month for 3 months. I used it to buy several audiobooks, all of which were quite long. Google Assistant can start Audible, but for whatever reason it can't tell it to resume playing the last book I was listening to. Alexa on the Moto X4, however, not only can do that, but can also fetch the book I want by title and resume at the last known point. When I took a look at the app, to my surprise, the Alexa app on an Android phone doesn't start Audible and then start the book; instead, it streams the audio directly from Amazon's servers without starting that app at all! Not only does this mean I don't need the Audible app (I keep it anyway so that I can cache books on the SD card), the latency is also much lower than having Google Assistant start the Google Music app and wait for it to start playing. I haven't tried it, but I'm pretty sure the Alexa app also streams music directly without starting the Amazon Music app.
I shouldn't have been surprised, but because of my history working for Google and using Google products, I knew that in a million years, no Google product manager would take this approach. I systematically broke down how I ended up with no less than 3 Alexa products in regular use: the Fire TV Cube, the Echo Dot, and now my Moto X4. The Echo Dot was the easiest to explain: it was so Bowen could listen to audio books (again, from Audible).
The Fire TV Cube turned out to be a great entertainment center control device, and now replaces our Logitech Harmony Smart Control Hub, which I sold. Google doesn't have an equivalent unit, because to have one would be to acknowledge that devices exist outside the Google ecosystem, which apparently is a no-no, leading to the elimination of the headphone jack/audio output port from not just Google's phones, but also the Google Home series of smart speakers. That meant the nice speakers in the living room are now "owned" by Alexa, and so my wife added an Amazon Music subscription, even though all my personal music was sync'd to Google Music. Doubling down on higher-end audio, Amazon is even launching an amplifier that supports Alexa.
Similarly, I ended up using Amazon Photos for RAW photo backup, because it was already folded into the Amazon Prime subscription, which bundled in TV shows for the kids that are turning out to be very good. And because of that Prime subscription (as well as the huge collection of books on Amazon), we now have 2 Fire HD8 tablets that the kids use as general purpose Android tablets as well as just video streaming.
I scratched my head as to how Amazon ended up with me as a loyal customer despite my background, and I realized that this is where Amazon's product design/product managers trumped Google's superior engineering. Sure, Alexa is not bilingual, while Google Home/Voice Assistant is. But since all that speech recognition is done in the cloud anyway, I'm comfortable waiting for Amazon to implement it eventually (or if it doesn't, we've learned to live with the limitations). But not having a device that can hook up to our entertainment system meant that Google Home speakers were never in the running for the living room. You can't upgrade hardware that doesn't have the proper I/O channels, while you can easily upgrade software in the cloud!
Similarly, not having a decent e-book reader meant that the default e-book reader of choice was always the Amazon Kindle, which has superb integration for my favorite book vendor of choice, the local library. And ever since Google abandoned the low end Nexus 7 tablets in pursuit of Apple-like prices (and presumably profit-margins) for Android tablets, that meant that the tablets would default to Amazon's ecosystem as well, since no one else is selling decent tablets at $50 each.
What would I do if I were a Google product manager trying to counter this onslaught? There are probably some things Google will never do, like produce a decent e-reader, so that's probably out. But bringing back a decent low-end Android tablet is probably something Google can do, since it has done so in the past. I'd bundle Google's services: Google Shopping Express, YouTube Red (or whatever it's called now), and Google Drive storage/Google Docs should all be bundled together at one price. Put out a Fire TV Cube equivalent with sufficient control over other devices in the living room (an IR blaster is enough). Even all that might not be enough, but at least it would make it feel like Google is trying. As it is, it definitely feels like Google doesn't have a coherent, integrated strategy where everything fits together, while Amazon does (and at a very high value-to-price ratio!). Google's strategy feels like that of a company chasing after Apple's customers, but with none of the integration, social prestige, and marketing prowess that Apple puts into its efforts.
Friday, December 02, 2016
Review: Harmony Hub & Echo Dot
A year ago, I bought the Amazon Echo and returned it. It was a great device, but didn't really justify its place in the living room. It was too big, it didn't do very much, and it did a terrible job of voice recognition for my wife and Bowen. (The non-English speakers in the household obviously couldn't use it at all!) A year later, the Echo Dot is $50 ($40 during the holiday season), and it's basically the Echo stripped of the speakers, requiring you to plug it into the entertainment center's speaker system. That's perfect, since the speakers in your entertainment center are likely much better than any puny portable speaker. Much has been made about how the Google Home device is cheaper than the Echo, but the reality is that most people should really buy the Dot instead.
Out of the box, the device could control my Sensi thermostat. Realistically speaking, however, you're not going to adjust your home temperature that way. If your programming is up to par at all, you'll tweak the thermostat at most once a month, and remembering the voice command to do that is more onerous than pulling out the smartphone and running the app.
But once I got the Logitech Harmony Smart Control Hub ($70 right now on Amazon, which is a great holiday season deal), the Dot proved to be extremely useful. I'll summarize what the Harmony Hub does. It plugs into the wall, and you can program it with your computer or smartphone app to act as a universal remote. What's great about it is that it accepts commands from your smartphone, a "simple remote", or another universal remote via RF. That means it can sit inside a cabinet and still receive signals. It incorporates an IR blaster, which can then activate all the other devices in the same cabinet. For devices outside the cabinet (e.g., the TV), it comes with an auxiliary IR blaster that can be plugged in and run outside the cabinet.
Put it together with the Echo Dot, and wow! The old universal remote could power on the IR-driven devices, but couldn't turn on the PS3. Now I walk into the living room and say, "Alexa, turn on Playstation." It immediately powers up the PS3, TV, and speakers, switching the speakers over to the PS3's output. "Alexa, turn off AV" turns everything off. No more hunting for the remote, no pulling out the phone to switch to the app. As someone who's never cared about home automation (seriously, do you need voice control to turn on the lights?), this is truly a "Star Trek", living-in-the-future experience. And no, Google Home can't do it, because it doesn't support "external skills" yet.
The penalties: I still can't access my Google Music library, and I'd have to pay $24/year to upload all my music to Amazon's music library. That sucks. I'm sure at some point I'll break down and pay if music becomes important enough.
In any case, I highly recommend this combination. Given the sales during the holiday season, it's well worth the time to set it up.
Friday, April 29, 2016
Review: Matterhorn
One by one, you get to know other members of the company, officers, NCOs, machine-gunners and yes, the guy who's only got a few weeks left to go on his tour and is dreaming of going back to the girlfriend he left behind in Thailand. The story is good, with Bravo company getting screwed over by senior military officers who're trying to make themselves look good at the expense of the men they command.
If you're wondering why a Vietnam War novel might be relevant to a software engineer, I think this short passage might change your mind:
“You know why we’re really strung out in this fucking death canyon?” Mellas didn’t know, so he just grunted. “Because Fitch doesn’t know how to play the fucking game. That’s why. He’s a good combat leader. I’d literally follow him to my death. But he’s not a good company commander in this kind of war. He got on Simpson’s bad side because he got his picture in the paper too often and never gave Simpson credit, which by the way he doesn’t deserve, but that’s the point. The smart guy gives the guy with the power the credit, whether he deserves it or not. That way the smart guy is dangling something the boss wants. So the smart guy now has power over the boss.” (Loc. 3841-47)

Over and over again, the novel doesn't flinch from the power politics played at high levels in a corporation (and in this case, the Marine Corps is just as functional or dysfunctional as any large corporation). At one point, Bravo company is tasked with digging trenches and building bunkers to defend a hill --- only to be told to abandon it to prepare for another assault elsewhere in Vietnam. Whereupon the North Vietnamese Army (NVA) promptly takes over those defenses, and Bravo company is then tasked with assaulting the very defenses they had built, from a disadvantageous position. The poor Google engineers who built the very first version of Google Drive, who were similarly told to abandon it only to have to launch it again after Dropbox proved that the market existed (and was pretty lucrative), must have felt much like the marines in Bravo company. In fact, just as some of the high-performing officers were unfairly blamed by their commanding officers for incompetence, I myself heard a Senior Staff Engineer at Google blame the former tech lead for Google Drive for failing to push back against the killing of the project.
There's a passage where Mellas thinks about the Colonel in charge of the operation:
Mellas would probably have said that Blakely didn’t have what it takes, but Mellas would have been wrong. Blakely would have performed a lower-level job just as well as he performed his current job—competently, not perfectly, but well enough to get the work done and stay out of trouble. He’d make the same sorts of small mistakes, but they’d have a smaller effect. Instead of sending a company out without food, he might place a machine gun at a disadvantage. But the Marines under him would make up for mistakes like that. They’d fight well with the imperfect machine-gun layout. The casualties would be slightly higher, with slightly fewer enemy dead, but the statistics of perfection never show up in any reporting system. A victory is reported with the casualties it takes to secure that victory, not the casualties it would have taken if the machine gun had been better placed. There was nothing sinister in this. Blakely himself would not be aware that he’d positioned the machine gun poorly. He’d feel bad about his casualties for a while. But reflecting on why or for what wasn’t something Blakely did. Right now the problem before him was to engage the enemy and get the body count as high as possible. He wanted to do a good job, as any decent person would, and now he’d finally figured out a way to do so. He might actually get to use the entire battalion in a battle all at one time, an invaluable experience for a career officer. (Loc. 6174-84)

That's the reality of management in a big organization, and an inherent limitation of the data-driven management techniques used today. Suboptimal code (or machine-gun placement) sure as heck matters to the marines who get killed because of it (and to the engineers who have to maintain or work around the problems), but it's not visible at all at the aggregate level to senior management. As a result, incompetent managers with serious political skills get promoted far more frequently than competent managers who lack such skills.
In a high quality organization (like the Marine Corps or Google), the rank-and-file who get hired (or enlisted) are so good that they can make even incompetent managers look great. In fact, in certain circumstances, high casualties, constant war-rooms, and constant enemy engagement can make such managers look like stars, even though a better manager could have avoided all of the above. (And no, I have no idea whether the Marine Corps or Google's rank and file are really that far above average nowadays, but back when I was at Google, the average engineer was really really good, and in many cases much better than the average manager)
I'm at risk at this point of making this novel sound like a treatise in office politics, self-promotion, and lessons in how to make yourself (and your boss) look good rather than a great novel. Let me try to disabuse you of that. It's a great novel. It's got great characters, a transparent prose style, an interesting plot and setting. It explains why the North Vietnamese beat the Americans despite the latter's overwhelming technology advantage: the terrain and weather negated most of the advantages the Marines had over their enemies, and organizational dysfunction took care of the rest.
But at this point, the novel has won so many awards and accolades (it took 30 years to write and publish!) that anything I can say about the conventional aspects of the novel can be (and probably has been) better said elsewhere by professional reviewers. The novel delivers everything a novel should deliver, and provides lessons and entertainment in spades. I paid $2 during a Kindle sale for it, but knowing what I know now would not hesitate to pay full freight. Buy it, read it, and enjoy the heck out of it. And as you do read it, the management/political lessons it provides might turn out be really useful in your career. That makes this book highly recommended.
Monday, November 30, 2015
First Impressions: Sensi UP500W WiFi Programmable Thermostat
Then I found a deal for the Sensi WiFi Thermostat on Amazon. While my ultra-geek friends went for the Nest, if you're even a little bit skeptical, you'll find on-line horror stories about the Nest failing in all sorts of potentially dangerous ways. In particular, the requirement for a C wire is such that if you live in an older house or have a system that doesn't provide the C wire, you could potentially burn down your house, because the system then draws power from the HVAC control wire. Yes, one of my geek friends rents, so he doesn't care if the house burns down, but at least another few do own their homes. Assuming you survive such an event, of course, Google (which now owns Nest) has such deep pockets that you could probably recover the cost of replacing the house, plus make a tidy profit.
Why does Nest do this? Rather than require the owner/user to occasionally replace AA batteries in the thermostat, Nest includes a rechargeable battery in the device. That device, however, charges itself by drawing upon a C wire (or in the absence of such, the HVAC control wires). You would think that the product managers would specify, for instance, that the device in such a case should shut down rather than potentially burn down a house, but remember, this is the same company that decided that it would rather prevent you from being able to receive e-mail than to separate your photo quota from your e-mail storage quota.
Anyway, after determining that the Sensi wouldn't potentially burn down my house (it includes AA batteries, and you do have to replace those batteries occasionally), I embarked on the installation project. To do this, you download the Sensi app from the app store, which then walks you through the procedure: remove the old face plate, label the wires, unscrew the old wires, uninstall the old backplates, install new backplates, wire the labelled wires into the appropriate screw slots, install the new face plate, and then visit the WiFi settings on your phone. All through the procedure, the app holds your hands, even offering you videos if you should be unsure. This is more reassuring than most manuals.
The device then sets up a WiFi network which you connect to from your cell phone. Once that happens, your smartphone app programs the device's WiFi settings, gives your device a name, and pairs it so you can now control the thermostat remotely. I checked the heating and the cooling, and then proceeded to list my Hunter for sale on Amazon. (If you live locally and want my old thermostat, just drop me a note.) I could install the same app on multiple devices, and any one of them could control the thermostat.
The device isn't fancy. It has no proximity sensor and doesn't learn when you're in the house or your habits. But as my wife points out, the reason for the thermostat isn't to replace human control, but to let us turn off the device while we're away and forgot to do so. That, and not burn down the house without our help or the help of our 2 sons.
All in all, I'm pleasantly surprised by how easy it was to install, and that it was possible for a complete newb like me to do so. Even if you bought this at the regular Amazon price instead of the deal I got, it's still half the price of the Nest. It's not fancy, but that means that those AA batteries will last a good long time. And if those AAs run down when you're on vacation, rather than running up the power, the device will just shut down WiFi and run on your existing schedule, which is what you want.
Recommended.
Friday, September 18, 2015
Indexing Google's Source Code
I wrote the first version of gtags in 1991 (yes, gtags is older than Google!), when I was at Geoworks. GEOS was several million lines of assembly, including every freaking app written for that OS. Since every object could potentially call any other object, the entire code base was relevant. Needing to get my head around that code base, I tried to build a TAGS database and that immediately caused my Emacs to start swapping. The performance was unacceptable.
The core insight was this: there's no reason to use a disk-based search on a TAGS database. Stick the entire database into RAM, use a hash table to look up your keywords, and search performance goes from multiple seconds (half a minute in some cases) to sub-second response times. So one weekend I coded up the hash table, wrote code to load a TAGS database into memory, and implemented a simple UI that let me browse code in Emacs. Soon I enjoyed sub-second search times and could grok code that would have been impossible to understand any other way.
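The idea can be sketched in a few lines. This is my own minimal illustration, not the original implementation (which predated Python and used a hand-rolled hash table); it also assumes a simplified tab-separated record format rather than the real TAGS file format:

```python
from collections import defaultdict

def load_tags(records):
    """Load symbol -> [(file, line)] entries into an in-memory hash table.

    Each record is assumed to be "symbol<TAB>file<TAB>lineno" -- a
    simplification of the real TAGS format, for illustration only.
    """
    index = defaultdict(list)
    for record in records:
        symbol, filename, lineno = record.rstrip("\n").split("\t")
        index[symbol].append((filename, int(lineno)))
    return index

# Once loaded, every lookup is a single O(1) hash probe in RAM,
# instead of a linear scan over a multi-megabyte file on disk.
tags = load_tags([
    "ObjCallSuper\tobject.asm\t120",
    "ObjCallSuper\tkernel.asm\t88",
])
print(tags["ObjCallSuper"])
```

The one-time cost of loading the whole database into memory is paid back on the very first query.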
If I ever needed validation that the tool-building approach to coping with large-scale software was the right one, this was it. Once the senior engineers (remember, I was an intern at Geoworks then) got hold of the tool, I saw even loyal vi users switch over to Emacs just to get their hands on the code-browsing functionality (going from half a minute per search to sub-seconds was critical).
After I left Geoworks, most of my code was in C, C++, or other high level languages. Computers got so fast, and IDEs so sophisticated that I never dealt with a code base that couldn't be loaded into the IDE. It seemed to me that the need for such functionality had been obviated by ever more powerful machines.
That was, until I joined Google in 2003. By then, Google's code base was already approaching 1 billion lines, spread across multiple languages. I needed to wrap my head around that code base in a hurry. Various teams were using random tricks to subset Google's code base into their IDEs, which I thought was a kludgy and unsatisfactory way to work. So in my 20% time, I rewrote my old tool using Google infrastructure (thanks to Craig Silverstein, who was the only person who believed in my code enough to waste precious time code-reviewing it --- even then he was skeptical that my tool would be widely used or even substantially useful, given the huge amount of effort people had put into subsetting the codebase). I coded up the UI again in Emacs Lisp. I actually had to put some effort into the UI this time, given that C++ (and Java) overloading meant there were multiple search results for any given search term. Thankfully, Arthur Gleckler came in to lend a hand. Reading Arthur's Lisp code was like reading poetry: you can't believe the succinctness and elegance that can be expressed in so little space. It's worth your time to learn Emacs Lisp just so you can read Arthur's code.
Just as I expected, gtags took off in a huge way inside Google's engineering team. (By the time I left, the metric was 2,500 daily active users, or about 25% of Google's engineering workforce. The internal tools team did a survey once and discovered that nearly every engineering workstation had a copy of Stephen Chen's gtags-mixer running on it.) There wasn't a wholesale conversion from vi to Emacs, though: Laurence Gonsalves stepped in and wrote a vim script that emulated the Emacs code. I don't even remember how I managed the code review for that checkin, but anything to help gtags, so I must have just gritted my teeth and done the review, or asked Laurence to find someone competent to do it.
But I wasn't nearly even close to done. Because of the huge amount of ambiguity and overloading involved in C++ and Java, I wanted gtags to do a better job of ranking the results of any given search. Phil Sung took a first crack at it, introducing Sung-ranking and later on, an include-rank that mirrored page-rank, except for code. Stephen Chen solved the problem of how to intermix protocol buffer files into the search results. Matei Zaharia (now a professor at MIT) spent a summer integrating a parser into the indexer for gtags, so that it was no longer a dumb lexical scanner but a full-on type-comprehension system for both C++ and Java. He also designed and implemented incremental indexing on Google's code base, no mean feat. Leandro Groisman and Nigel D'Souza both also made major contributions to gtags.
For several years, I had the entire Google source repository downloaded and checked out on a dedicated gtags indexing machine sitting under my desk. It was a standard underpowered workstation of that era: dual core, 2GB of RAM, and 500GB of disk: it had a special p4 client that eliminated the need to download any binary assets, since it was only interested in code! It was probably a major security hole, but I figured since Bill Coughran knew about it, I wasn't violating any corporate policies.
This illustrates a very important point: 2 billion lines of code sounds like a lot, but if you do the math (assuming 50 characters per line) you'll get only about 100GB of data, uncompressed. After dropping comments, whitespace, and lines that don't contain any declarations, your index is going to be pretty tiny, and since the code base splits naturally into several corpora (C++, Java, protocol buffer declarations, Python), each individual server could easily hold its entire corpus in RAM without any fancy sharding. Too many people get caught up in trying to apply fancy Google techniques designed for terabytes of data when they're dealing with tiny amounts of data that fit into RAM and can be managed with traditional programming techniques.
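The arithmetic is worth making concrete. A back-of-the-envelope sketch (the declaration density and per-entry size below are my own rough assumptions, not figures from the actual system):

```python
# Raw source size: 2 billion lines at ~50 characters per line.
lines_of_code = 2_000_000_000
chars_per_line = 50

raw_bytes = lines_of_code * chars_per_line
print(raw_bytes / 10**9)   # → 100.0  (GB, uncompressed)

# Assume (roughly) 1 line in 10 carries a declaration, and each index
# entry averages ~20 bytes -- the index is an order of magnitude smaller:
index_bytes = (lines_of_code // 10) * 20
print(index_bytes / 10**9)  # → 4.0  (GB)
```

Split 4GB across per-language corpora and each server's share fits comfortably in the RAM of an ordinary machine, even of that era once 64-bit arrived.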
In any case, gtags was a very hardware light project: it never took more than one machine to index all of Google's code base (and we never had to apply any fancy MapReduce techniques), nor did the serving cluster ever exceed more than about 10 machines. We came close to maxing out the RAM available on 32-bit machines for a while, but between Phil's string table optimization reducing memory use by 75% and the switch to a 64-bit architecture we never ever had to split indexes for any given language (there was a server for each language) across multiple servers. Those servers were under-utilized of course (they could probably have served 25,000 or 250,000 users at once), but on the flip side, you always got sub 10ms response times out of gtags. We switched from dedicated gtags server desktops sitting under people's desks to Google's cloud internally fairly early on, with Ken Ashcraft doing much of the work of converting gtags into a borg-ready service.
This came to a head when Google added the China office sometime in 2005 or so. After that, the powers that be decided that high intellectual property (HIP) code needed special permissions to access. Since I wasn't HIP enough, I simply stopped indexing that code. This burdened the HIP people so much that eventually some of them (including Sandor Dornbush) contributed to gtags. A HIP guy would take on the burden of downloading HIP code and indexing it using our indexer, and then put up the gtags server with that code behind a HIP firewall. The gtags-mixer would then be configured to talk to the HIP server and mix-in the result if you were HIP enough.
One of my prouder moments at Google was when Rob "Commander" Pike came to me and asked me how gtags worked. It turned out that he didn't want to talk to the gtags mixer or the gtags server, but just wanted his programming environment/editor to directly grok the output of the indexer. I was happy to give him access to the index for him to do whatever he wanted with it. I forget the mechanism by which this happened: he might have simply scp'd the index over to his machine, or I might have had the indexer push the index to his machine whenever it was done. This was great, because Rob became one of the folks who would notice whenever the indexer was broken because the file wouldn't get updated!
In any case, as with many things at Google, after I left gtags got replaced by some cloud solution that took way more resources than me, Arthur, and a bunch of interns, and I'm sure everything I wrote has been long retired by now, with the possible exception of the Emacs Lisp front-end.
Even after I left Google, gtags paid me back. Soon after I met my wife, she talked to some of her friends at Google about who she was dating. One of them did a p4 lookup on my changes and said to her, "Hey wow, this guy has code committed everywhere, even the protocol-compiler." So I guess that worked out as far as a back-door reference check was concerned. (That change in the protocol-compiler was necessitated because I wanted to inject a clue into its output: that clue enabled the gtags indexer to map a generated C++ source file back to its original .proto form --- it was far easier to have the protocol compiler emit the clue than to try to guess --- it was a trivial change and Sanjay approved it in seconds.)
If it seemed unbelievable to you that during that period of time I had such an illustrious group of people on a tiny 20% project, it should be. But I maintain that the test of a high quality engineering organization is whether or not that organization is able and willing to invest time, money, and effort into building tools that enable that organization to move faster and produce higher quality code. Google met that test and passed with flying colors.
Wednesday, February 11, 2015
Interviewing and Performance
My answer to that question is "no." What most of the other articles do not note is that Google actually does have data as to the major factor influencing on-the-job performance (at least, performance as viewed by Google's notorious promotion committees). It turns out that even in 2003-2004, Google had data indicating that your first tech lead at the company strongly predicted how well you would do in the future inside Google.
There are several reasons for this. One obvious one is that the promotion committee is likely to weigh your tech lead's comments on your performance more heavily than some other random engineer's. The deeper reason, however, can be found in the book Chasing Stars. Fundamentally, all organizations have stated or unstated rules for how they work. How well the on-boarding systems explain those rules to new employees and indoctrinate them in the culture very much determines future performance.
Google at the time I joined was a largely oral culture. A typical Noogler working through the engineering training document during his first week would, if he were conscientious, find several bugs a day in the documentation, each necessitating a change. Out-of-date documentation was rampant, and the tech docs team had their hands full trying to keep up with the amount of code and internal APIs continually being churned. If you actually had to get work done, your most important tool wasn't the documentation or the internal search engine (which was laughably bad), but knowing whom to talk to. For instance, if you needed to make a change to crawl, and your tech lead knew to say, "Go talk to Arup Mukherjee and ask him how you would do this," you were in luck: you'd be productive and efficient. If your tech lead said, "Go read the documentation," or worse, "Use the Source, Luke," not only would you waste a lot of time reading both code and documentation (as I was once forced to when dealing with the search results mixer), chances are that when you were done you would have done it wrong, and your code reviewer would spend gobs of time correcting the way you did things, forcing you to do everything over and over until you got it right. If that happened, you might as well kiss your "Exceeds Expectations" performance review goodbye. (And yes, I lucked into knowing people who wouldn't just tell me who to talk to, but walked me to their cubes, provided introductions, and made it clear that what I was supposed to do was important enough to deserve help.)
I'm fond of saying that context matters a lot when it comes to performance. This type of context-sensitive performance isn't necessarily because the tech lead deliberately led the poor engineer wrong. It's because the tech lead did not provide a suitable context for the engineer to work with, and in the process made the job much, much harder (or in some cases nearly impossible) for the new engineer. Hence, if your interview process successfully eliminates people who can't actually do the job, but you end up with variable or unexpectedly poor on-the-job performance from people who should be doing well, you need to examine your on-boarding process or the training process for your leads/managers.
The follow-up to this question then is, "If performance is so determined by context, why do we bother with interviews?" The answer is that the goal of the interview isn't to predict future performance. The goal of the interview is to ensure sufficient technical competency and cultural compatibility that, with a good on-boarding process or a decent tech lead/manager, the new engineer ought to be able to do a great job. Hence, when I run interviews, I don't favor esoteric problems that require dynamic programming (for instance), but basic data structure questions. While I consider basic tests such as the Fizz Buzz Test way too simple and insufficiently indicative of basic computer science knowledge, coding questions that approximate that level of complexity (while still testing basic computer science concepts) are all that is typically needed to weed out people who simply can't code and shouldn't be allowed access to your source control system.
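To make the level of difficulty concrete, here is the kind of question I mean -- my own illustration, not a question from any company's actual bank: "given a block of text, return its most frequent word." It's barely harder than Fizz Buzz, yet it still exercises hash tables, iteration, and edge-case thinking:

```python
def most_common_word(text):
    """Return the most frequent word in text (case-insensitive).

    FizzBuzz-level difficulty, but it still tests whether the candidate
    reaches for a hash table and can reason about ties and empty input.
    """
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return max(counts, key=counts.get)

print(most_common_word("the quick brown fox jumps over the lazy dog"))  # → the
```

A candidate who can't produce something like this in a few minutes almost certainly can't do the job; one who can is worth moving on to design and culture questions.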
Monday, October 06, 2014
Gaming the Coding Interview
For instance, if you read Cracking the Coding Interview and were diligent about it (i.e., actually worked through the problems and practiced them), you'd stand a good chance of doing really well during Google's interview process. Lest you think that this is a recent phenomenon, even in 2003, Google's interview process was very similar. I remember being asked to reverse all the words in a sentence, along with a few other puzzler-type questions, and I remember one interviewer telling the next one as the hand-off was happening, "this guy knows all the standard interview questions." Back then, Gayle's book didn't exist, but 10 years of interviewing for startups and interviewing at startups had hit me with every interview question that could be easily covered in a 45-minute session.
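The reverse-the-words question is a good example of how gameable these are: once you've seen it, the answer is mechanical. A sketch in Python of both the one-liner and the "clever" in-place variant interviewers usually fish for (this is the standard textbook approach, not a reconstruction of any specific interview):

```python
def reverse_words(sentence):
    """Reverse the order of words, keeping each word intact."""
    return " ".join(reversed(sentence.split()))

def reverse_words_in_place(chars):
    """The O(1)-extra-space variant on a list of characters:
    reverse the whole array, then un-reverse each word."""
    def rev(lo, hi):
        while lo < hi:
            chars[lo], chars[hi] = chars[hi], chars[lo]
            lo, hi = lo + 1, hi - 1

    rev(0, len(chars) - 1)          # reverse everything
    start = 0
    for i in range(len(chars) + 1): # find word boundaries
        if i == len(chars) or chars[i] == " ":
            rev(start, i - 1)       # reverse each word back
            start = i + 1
    return chars
```

A candidate who has drilled Gayle's book produces the in-place version from memory; that tells you they prepared, not necessarily that they can design software.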
I will note that Facebook does have tougher interviews today than Google (they're hiring slower and therefore can be more picky), but from what I've seen their interviews are no less subject to being gamed.
When I look back at the interviewing process, there's really only one company that's stood out for having an interview process that couldn't be easily gamed, and that's Wealthfront in late 2012. I only include the date because in between, startups can change a lot and for all I know they could be interviewing like Google today.
The way Wealthfront conducted their interview was by pair programming. The candidate would come in, and pair program real problems with their "interviewer". The experience is intense, and in many ways eliminates the possibility of hiring someone who couldn't even write correct Java syntax, or construct unit tests for code he'd just written. It's a good way to go and difficult to game, since you have to actually be able to design, structure, and turn ideas into code, all the way through the testing and debugging steps.
Another good idea I've seen at certain startups is to put the culture fit interview first, before any technical interviews get done. The reason for this is if you get a candidate who's stellar on the technical side, it's actually very difficult to reject him for cultural reasons. I can attest to this, as one of my early hires at Google bombed out precisely for that reason, though without doing much damage. By putting the cultural fit interview first, you eliminate the bias to hire, even though you might waste a bit of time.
Thursday, September 18, 2014
Exploitation
Many of my former colleagues have said something like "I wasn't really exploited. I'm going to donate my money from the settlement anyway, so it doesn't matter how much it is." This tears me up. It tears me up especially since the kind of people who say that tend to be white, privileged, and have never really had to struggle to make a living.
When I was 20 and a struggling student (yes, I actually did receive Pell grants), I had to work 2 jobs simultaneously while carrying a full-time load to avoid having to take out crushing amounts of student loans. I had a deep aversion to carrying debt at that time and I still do now. I worked for a tiny company in Berkeley called Geoworks over the summer full time as an intern. Geoworks was your prototypical technology startup, and had lots of cool projects, including the idea that you could work on whatever you wanted and no manager would stop you, provided you got your main job done. Of course, that meant that many of us worked 80-100 hour weeks for fun. (Google called this 20% time; Geoworks called it "anarchy time") In fact, the predecessor for gtags was a tool I wrote during anarchy time for Geoworks to browse and navigate the multi-million lines of assembly that encompassed GEOS. For all that, I was getting paid a nominal $15 an hour, while working way more than 40 hours a week. I think I might have clocked 80 hours a week normally.
At the end of the summer, I was due to go back to school. The management team at Geoworks took me aside and said, "You'll be working fewer hours, and so as a result, we're going to cut your hourly rate because you will not be as focused on your work as you were when you were full time." They proposed to cut my pay to $12 an hour, in addition to giving me only 20 hours a week. I was by no means someone they were trying to get rid of, since they would try to hire me again next year as a full time engineer after I graduated. I was hopping mad. I quit and worked as an undergraduate TA at school instead, reasoning that if I was going to be exploited (Berkeley only paid $10/hour), I'd rather be exploited by a non-profit and help my fellow students and avoid the walk to downtown Berkeley and stay on campus instead.
Years later, other former interns from Geoworks would thank me for my actions, because after seeing someone they thought was loyal walk out over that 20% hourly rate cut, management at Geoworks backed off on that policy.
What relevance does this have today? You'd think that back then tech workers were plentiful, companies didn't need as many, and there wasn't as much competition for workers as there is now, so this sort of thing couldn't happen today. But you'd be wrong. Just a few years back, one of Google's early SREs left Google and joined Facebook, based on something very similar to my story above. After that event, Google gave everyone on his team a raise. Was that competition helping out? Or was it simply because Facebook refused to join the cartel that Google, Intel, Apple, Adobe, and several others put together? Regardless of how you feel about Facebook as an employer or product, engineers in the valley owe a huge favor to Facebook for walking in, breaking the cartel, and raising wages as a result.
Here's the thing: Google and Apple have engineers that are the strongest in the industry. You would think that it would be impossible to exploit such an incredibly valued bunch of folks, yet these large corporations got together and did it, and successfully got away with it, getting a slap-on-the-wrist settlement from the government. If these companies get away with murder when it comes to Google-class engineers, what do you think happens to the women and minorities in the profession who aren't in the top tier? That marginal worker on average discovers that the low pay and long hours common in the profession do not pay enough to keep him or her working in software development. As a result, the average software career is much shorter than careers in other engineering professions, allowing the industry to claim a shortage.
I don't care if you personally don't need the money from the settlement (I don't, either). But when exploitation of workers happens, call it out. Don't sit back and behave like a spectator: let everyone know how unfair it is, and how it shouldn't be allowed to happen. By doing so you're not just helping yourself, you're also helping engineers that aren't working at your tier. Otherwise, all the noise about trying to get more women and minorities into the profession is just noise; until you can get fairness in the workplace for the top tier workers, you don't have a prayer of making it attractive for the marginal tech worker or helping those who aren't in the 1%.
Monday, April 28, 2014
Review: Android Studio
Android Studio is based on IntelliJ IDEA. Back at Google when I was doing Java work, I avoided it like the plague, preferring to stick with Emacs and gtags. That's because Google's Java source base was so big you couldn't possibly load it into IntelliJ on the puny workstations of that time (yes, Google only supplied machines with 2GB of RAM in 2003), and even if it had been possible, those machines would have slowed to a crawl under the load of processing that much code. IntelliJ/Eclipse die-hards were resorting to wacko tricks like subsetting Google's code base so it could load into IntelliJ and then writing plugins into gtags for accessing the rest of the source code. I have no idea what Googlers do today, but my suspicion is that things haven't gotten much better.
For small Android projects like Nutrition Tracker, however, IntelliJ is just about right. If you're unfamiliar with the Android API, it will supply you with method name completion, tell you which arguments to supply in which order, automagically add imports, and allow for automatic refactoring tricks such as moving methods, renaming variables safely, moving inner classes out of their outer classes, shifting classes between packages, etc. The layout tool helps you avoid having to learn the lame layout XML language, so you can actually try to make things work (as opposed to making things look pretty and usable --- I think Emacs is a great UI, so I have no expertise on those topics).
Android Studio is slow. It's slow to start up, it's slow to compile, and it's slow to run the debugger. A typical edit-compile-debug cycle takes around 10-20 seconds just to build a tiny app. Note that I'm not complaining about Android Studio's massive use of resources while I'm editing. I think that's entirely appropriate. I want all my 4 cores/8 threads to be utilized in order to make my coding experience faster and more pleasant. I don't even mind the startup, since it doesn't need to happen that frequently, and it's a one-time cost. But the Gradle build system is not only a resource hog, it also introduces additional latency into my think-time, so I begrudge every second it spends traversing dependency graphs instead of actually compiling code. I have no idea why the Android Studio engineers chose a clunky system like Gradle, as opposed to rolling their own and integrating it fully into the IDE. I never want to edit the Gradle build files manually, but the system forces me to. What's more, the syntax is really obscure and the documentation is inadequate.
For instance, when doing an Android release, the documentation only covers Eclipse. Worse, the documentation lies to you. It tells you to modify your Manifest file, and I did, then kept scratching my head as to why that never worked. It turned out that you had to modify the Gradle config, since the Android Manifest XML file is ignored in the presence of Android Studio. Well, that took so much googling around that I can't remember what search term I used to uncover the Stack Overflow answer any more.
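For reference, the kind of build.gradle block you end up having to write looks roughly like this (the keystore path, alias, and passwords below are placeholders, not values from my project):

```groovy
android {
    signingConfigs {
        release {
            // All four values are hypothetical; substitute your own.
            storeFile file("my-release-key.keystore")
            storePassword "change-me"
            keyAlias "my-key-alias"
            keyPassword "change-me"
        }
    }
    buildTypes {
        release {
            signingConfig signingConfigs.release
        }
    }
}
```

None of this is discoverable from the official release documentation of the time, which is exactly the problem.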
The source control integration is also funky. It supports Git, Mercurial, and Subversion, but not Perforce. Given that Google uses Perforce internally, I surmise that Google's internal projects do not use Android Studio. This does not bode well, since it means that Android Studio's most serious problems (build performance) will most likely never get addressed, because its non-existent internal customers will not feel the pain.
For quick and dirty Android projects, Android Studio is probably the best there is. If you're serious about building an Android app, however, my suggestion is that you use Emacs and roll your own build system that's decently fast. Otherwise, the benefits from using an IDE will be swamped by inordinately long compile/edit/debug cycle times. Note that though my machine is old, it's still decently powerful even compared to the fastest rigs today, let alone the kind of laptops most "hip" developers favor, so it's unlikely you can solve Android Studio's problems by throwing more hardware at it.
Recommended only for non-serious Android projects. It's a great tool for getting started quickly though, so use it to bootstrap yourself into doing Android development if that's one of your goals.
Wednesday, December 26, 2012
First Impressions: Republic Wireless Defy XT (Dual Band)
Monday, December 24, 2012
Frank Spychalski MIA in New Zealand
The police in New Zealand have been contacted and have started searching for him from Wanaka, where he last posted. A backpacker there said he mentioned wanting to go to Mt. Aspiring's French Ridge hut.
Frank is a strong and careful hiker, but if there's any time to start worrying, it's now. If you've heard from him or talked to him since November 26th, please let me know. Additional information could save his life, not to mention spare his friends, colleagues, and family a lot of worry.
Friday, October 12, 2012
Long Term Review: Nexus 7
Here's what happened: after I used the device a bit, it started getting sluggish. I wondered why everyone else was raving about the device. While a factory reset would get it fast again, I really did not enjoy having to reinstall the device every single time. A search for "Nexus 7 slow" didn't return many results, but a search for "Nexus 7 sluggish" turned up a forum entry, which in turn led to this PSA.
Fundamentally, the 16GB variant of the Nexus 7 ships with defective eMMC packages. That means that if you load up the 16GB variant until there's less than 3GB of storage left, the machine grinds to a quick and sudden halt. Let me describe how slow it is: I could barely factory reset the device because it would not recognize my drawing the unlock pattern in order to do so. I had to reboot the device, and during the window in which the device was still (relatively) responsive, reset the device.
Some people reported that a factory reset sped up the device. Not so for me. I ran Androbench after a reset, and it reported a random write speed of 139 IOPs. (A standard N7 should be capable of 7000+ IOPs)
I called Google support and asked to return the device so my brother could get his money back (it was a birthday gift). Turned out I was 6 days too late. I also could not get the 8GB variant of the device instead, which does not suffer from this problem. The customer rep assured me that this was a rare problem, but given the amount of traffic on the internet about this issue, and the fact that a friend of mine who bought the 16GB Nexus 7 had the same problem, I'm willing to bet that it's not a rare problem, but just a problem that's so subtle that many customers just live with it, not knowing that the product isn't supposed to be this sucky.
You might wonder why Googlers don't have this problem. It turns out that most Googlers have only the 8GB version of the device, and the folks I spoke to said they mostly used theirs to check e-mail. This is the same reason why Apple didn't find out they had a maps problem --- they didn't actually have any power users on their dogfood list.
I understand that there will be screwups, and I understand that no product can be perfect. However, Google's customer service clearly doesn't reflect the reality of the situation, which is that the 16GB Nexus 7 is a dud and a defective product from the get-go, and never should have shipped. I am therefore rescinding my "recommended" tag on the Nexus 7 review.
If you want a tablet and need more than 8GB of storage, get a Kindle Fire or an iPad instead. (Frequent readers of this blog are aware of how much it hurts me to recommend an Apple product over an Android product) Google (and Asus) should be ashamed of themselves for foisting off such garbage onto the world.
Saturday, July 28, 2012
Review: Orange Internet Max (France)
Last year, I had trouble getting even regular voice SIM cards, let alone Internet-capable SIM cards. This year, however, we started our trip in Paris, albeit on a weekend. On Monday, I went to an Orange store and got a prepaid SIM card. It cost EUR 9.95. I bought a 10 EUR refill right away so I could subscribe to the Internet Max plan (which was 9 EUR, but the SIM card only came with 5 EUR credit, and the minimum refill was 10 EUR). It's an unlimited data subscription plan that's good for a month and automatically turns off if you don't have enough credit to resubscribe! The worst part of the experience is the part where Orange tries to pretend to be Apple. You walk into the store, and are greeted by a pretty woman dressed in an Orange uniform, who will put your name in a queue (driven by an iPad) so you can browse the store until a customer service rep is ready to talk to you. Unfortunately, they did this Apple-emulation strategy wrong: they had too many pretty women, and not enough customer service reps, so I ended up cooling my heels for at least 25 minutes before being able to complete an incredibly simple transaction. I would have preferred standing in line like at a normal store.
What an awesome plan it is. Most of the time, the speed is fine. Much faster than the iPhone 4G that I got as part of the home exchange program we participated in. And of course, any Android phone runs circles around the iPhone as a matter of practicality. Being able to get turn-by-turn navigation saved our bacon several times while driving (or walking!) around France. We were also able to tether the phone to the laptop whenever we were at a hotel without internet. Try this with your post-paid plan in the USA for less than $25/month!
The best part about this is that while Orange will try to charge you separately for e-mail, if you're using an Android phone, there's no need to pay for the e-mail plan separately. That's because the Gmail app on Android uses http requests, so it looks like browser traffic to Orange, rather than IMAP/POP, which is what Apple products use.
As an aside, after using an iPhone side by side with a 2 year old Nexus One running Android 2.3 (no I haven't bothered to upgrade the default OS yet, and probably won't --- I'm cheap with my time as well as money), it's no contest. I'd rather have a 2 year old Android phone than an iPhone when I'm in a foreign country and in need of navigation, search, and making phone calls.
Recommended. An Orange store should be the first thing you look for when you land in France.
Sunday, May 13, 2012
Sharing on G+: It doesn't always mean what you think it means
This is annoying, but hardly the end of the world. I really don't care about privacy, and it's very likely that future generations of internet users will care much less than the average current user as well. What truly annoys me is when somebody misunderstands the use of the circles sharing feature, and shares a previously public post as a non-public post. If I like that post, and then try to share it, I get a big red sign saying, "No, you're not allowed to share this as public, all you can do is to share it to your Extended Circles." As previously noted, extended circles is almost as effectively public as Public, but not quite. But darn it, the original post was Public. Just because one of my "privacy conscious" friends (who isn't actually privacy conscious --- see above) didn't choose to share it publicly doesn't mean that I should have to go hunt down the original poster, search for the post, and then repost it if I want to restore its original status, Public.
I'm guessing most Google+ users aren't as annoyed at this as I am, but each time I run into a post that was originally Public that I can't share publicly, it screams at me: "Google+ designers and engineers can't keep track of the original status of the post, so now you have to do it for them." And don't blame the users. The users think they're sharing privately.
Thursday, May 10, 2012
D&D at Google
Ironically, one interviewee I once spoke to rejected Google's job offer because he felt that while he would fit in at Google if he was nerdy and played D&D, he didn't think that as a ballroom dancer he would fit in. He was thoroughly wrong. Ballroom dancing has always been and will probably always be more popular at Google than D&D. In terms of social acceptability, of course, there's no contest: ballroom dancing simply doesn't have D&D's stigma attached to it.
In any case, someone on the SRE team bugged me and bugged me about running a D&D game at Google until I gave in and announced that I was willing to run one. At which point she promptly backed out of being in it. Nevertheless, I started the game in November 2005, and it ran until the end of 2007, with players shuffling in and out. The players included at one point or another, Paul Tyma, Shyam Jayaraman, Taylor Van Vleet, Ron Gilbert (who didn't actually work at Google), Tom Jiang, Neal Kanodia, Roberto Peon, Mike Samuel, and various drop-ins at one point or another.
One innovation that I got from my pre-Google days was to start a blog with in-character descriptions of the game. I would award experience points for writing the blog entries, which were very very fun. Ron, in particular, would draw cartoons involving his character Deathspank and members of the party in their exploits, including some very unheroic moments. Unfortunately, Ron has since yanked the cartoons from the blog, so I'm afraid you won't get to see them.
At the end of 2007, I wrapped up the campaign after all the characters hit 20th level, and moved to Germany. That ended my involvement with D&D at Google. Just yesterday, Tom told me that there hasn't been an epic game like mine since. It was fun and challenging DMing for Googlers (I minimized prep work by running from pre-written adventures whenever possible), and I enjoyed every minute of it. It definitely taxed and challenged my organizational skills to keep the game going for so long, and I definitely felt like I lost control at the end when the characters got too powerful. But that's a fact of the game, and since then there's been another edition of D&D that I have not bothered to play with or pick up. It might be that for me, D&D is something that happens every odd edition.
Tuesday, May 01, 2012
Independent Cycle Touring at Google
Independent Cycle Touring in Europe:
Imagine pedaling through quaint mountain hamlets in Switzerland’s Bernese Oberland, past fields of wildflowers in Germany’s Black Forest, along the shores of lovely lakes near Salzburg in Austria, or high above the Mediterranean in the French Pyrenees… With its diverse landscapes, vast network of roads and cycle paths, and bike-friendly accommodations, Europe is a fantastic cycling destination. Tonight, independent cyclist and guidebook author Piaw Na will share his expertise on planning bike tours in Switzerland, France, Austria, Germany, Italy, England, and Scotland. Piaw will cover the nuts and bolts of organizing an independent tour, including route-planning, seasonal considerations, lightweight gear, training, transporting bikes on planes/public transit, navigation tools, accommodations, and more.
The organizer, Anna Walters, is open to allowing outside visitors to attend the talk. If you are not a Google employee, please RSVP to me by the end of the week by leaving a comment on the blog so we can get a headcount, and Anna can see if Google is willing to accommodate that many visitors.
Googlers: I will keep the Q&A period relevant only to the talk, so bring your cycling questions. I worked at Google for many years, and remember teaching the League Road 1 course on campus in building 42 one evening. I was demonstrating how to fix a flat when Wayne Rosing walked by and peeked into the conference room we had commandeered for the session. He smiled, shook his head, and walked on by.
This talk was very well received at REI, so I look forward to giving it at Berkeley, where I launched many many bike tours (as well as supporting wheel building sessions). If you can make it, please come.
Friday, February 03, 2012
A Surprising Change in Google+ Engagement
Well, I went back to look at the month of January and wow, what a change a couple of months have made. Google+ is now right on top of my referrals at 262 visits, versus Facebook at 235. Quora is in 3rd place at 168, followed by LinkedIn. Of course, Google's organic search trumps everyone at 3000 visitors over the same period.
I have no idea what's caused the change, though there is one clue: new visitors from Google+ comprise a much smaller percentage of the referred volume than they do from Facebook and Quora. What this means is that most of my friends (a lot of Google affiliated people) have migrated over to Google Plus, probably from Google Reader, since Reader no longer has any social features.
I'm not sure what this means in the long haul: I suppose I could duplicate-post my current Delicious feed onto Google Plus for a bit to see if engagement goes up even further, but my suspicion is that it will have zero impact.
Regardless, it's clear: with engagement going up over the last 2 months, I cannot ignore Google+, even though I would have much preferred Friendfeed to win, for instance.
Sunday, December 11, 2011
Referrals: Google Plus versus Facebook versus Quora
Far more important than the sheer number of users, however, is the engagement of those users. For instance, while Buzz got pushed to nearly every Gmail user, most non-Google affiliated users told me (on Facebook, no less) that Buzz was a dead zone for them. So I used my own blog analytics to see whether Google Plus users got referred to my blog.
The number one referral (visitors who came from another site) to my blog came from Google.com. That's Google+, right? No. It turns out to be Google Reader. The next most popular referral was Facebook, which was almost as popular. (Though Google Reader users are really engaged: they visit 4 times as many pages as Facebook readers, who presumably see the link, click to read the article, and immediately leave) The next best referral came from Quora, the question-and-answer site. This came as a bit of a surprise. Then the dropoffs become really steep: my own Books web-site and Friendfeed, a so-called "dead" service. (As an aggregator of all my online activity, Friendfeed beats the heck out of all the other sites)
By the time I got to Google+ and Hacker News, I'm down to one fifth the visitors that Google Reader sends me. This is incredibly low. I doubt if Quora has 200M users, but their users are incredibly engaged, unlike Google Plus's.
Here are a few lessons I would draw from this:
- RSS support is really important. Reader, Facebook, Quora, and Friendfeed all support RSS export or RSS import so you can track somebody's content. Google Plus insists on you manually typing in a share with no method of automation. Even Twitter supports an auto-export from my blog to my Twitter stream. While I do try to promote blog posts on Google Plus, I don't always do so, especially for book reviews.
- Google Plus is still extremely niche. Even though I'm not active on Hacker News, for instance, it's still way better at sending me referrals than Google Plus.
- Twitter messed up. I have no way whatsoever of tracking Twitter referrals at all. As a result, it's not surprising that I rarely find time to engage on Twitter. But, because of the automation provided for sending blog posts, etc, automatically to Twitter, it costs me nothing to twitter my blog posts, so I do it. Which goes to show that automation will make up for other poor decisions on the social network front.
- I really miss the old Google Reader. The old Google Reader gave me 2X the engagement of the current Google Reader, according to the referral logs. It's a pity Google was willing to give up all that engagement, but I'm guessing that as usual, small fry like me don't count for very much.
Sunday, November 27, 2011
Review: I'm Feeling Lucky
Well, I finally checked out the book from the local library, and I'm glad I did. First of all, it's cool to see names and people you're familiar with. For instance, upon request, Edwards provided a pseudonym for a well-known engineer, "Claus." Well, as a Googler, it would take you all of 10 pages to figure out who "Claus" was, so what's anonymous for others isn't anonymous for you. Secondly, as a Googler, some mysteries are solved through stories from the old days. For instance, if you've always wondered why a certain executive is hated, this book explains why that person wasn't just hated by his/her reports, but also by other functional teams. It even explained to me why a certain engineer, despite his critical role in the company (and being one of the first ten employees), was denied refresher options and essentially told to leave. If you're a current or ex-Googler, this sort of gossip is fun and explains certain behavior that has roots somewhere in the murky past and which makes no sense today and (in many cases) didn't even make sense back when I joined in 2003.
This is primarily a book written from a marketing person's point of view. Furthermore, it's written by Google's brand manager. You're not going to find the sort of nitty-gritty technical details that would please someone whose life was devoted to Hacker News, for instance. On the other hand, the business milestones are documented in great detail: the AOL deal, the Yahoo deal, and the various Overture deals. Unfortunately, you're not going to get a lot of strategic insight: Edwards wasn't privy to those, and a 20 minute conversation with PengToh would do you a lot more good than reading this book if you wanted those.
Nevertheless, Edwards does provide some insight into the engineering organization. For instance, Google is famous for not providing positive feedback inside the engineering organization. I've met retired ex-Googlers worth multiple tens of millions in net worth who still seem emotionally scarred by the experience of doing amazing stuff that never got any recognition. What I didn't realize at the time was that this is part of Google's engineering DNA, buried deep inside its founders and early employees. If you hire former Google engineers, read this book, and you won't be as surprised as some Facebook managers who told me, "I thought I was getting a good engineer, but I wasn't prepared for how political Google engineers got as a result of never receiving proper recognition inside Google and having to fight for any acknowledgment they did get." That makes the book well worth reading for this insight alone, not just in case you happen to hire Google engineers, but also to ensure that your engineering culture doesn't end up like that, because while it obviously didn't hurt Google, there was no need to do this to otherwise valuable people.
All in all, I think this book is well worth reading, and definitely worth paying the Kindle price for. If you're affiliated with Google at all, I would encourage you to read this book. If you've got even a modicum of curiosity about Google, this book is so well-written that you will not feel like you've wasted your time. Recommended.
Sunday, November 06, 2011
Piaw's Feed Survey Results
85% of people used Google Plus as a social news site. 70% named Google Reader in second place. Not surprisingly, the same 70% said RSS input was important to them. 60% would prefer to read my feed on Google Plus, but 70% would take RSS (75% would consider Google Plus acceptable). 50% don't use Facebook.
Unfortunately, Google Plus is extremely unfriendly for sharing. You first have to +1 every post (which is stupid, because I might share articles I disagree with), and secondly, there's no RSS export, which means I can't with one click share to Twitter, Friendfeed, etc. This is a fatal flaw which I don't expect Google Plus to fix any time soon, since it's central to their "we want to be Facebook/Roach Motel" strategy. However, Delicious does support RSS export, has a bookmarklet that's not insane, and does allow me to comment, though it doesn't allow general replies, etc.
As a result, from now on, you can find my feed here on delicious. Maybe someday there'll be a reasonable write API to Google Plus, or Google Plus will support RSS output. (As well as a bookmarklet that's not insane)
If you want raw data, you can view it here. Thanks to everyone who participated.