
Thursday, April 12, 2018

First Impressions: LG V20

To be honest, I didn't purchase the LG V20 to use as a cell phone at all! What happened was that I was shopping for a high-res music player, in particular one with LDAC or AptX-HD support. But even the cheapest high-res music player from Sony was $218, and it wasn't capable of, say, playing back movies, or doing anything interesting other than playing music.

The LG V20, however, has always had great reviews for audio, and during an eBay sale, you could get one for $135 after a coupon code. That's even cheaper than a refurbed or used Sony, so I bought it, reasoning that at worst, I had a music player that could double as a portable movie machine on a plane as well.

The biggest problem with buying used or refurbished smartphones is battery life. Lithium-ion batteries have a limited lifespan, usually measured in cycles. If a phone's battery is regularly drained, within a year it goes from barely making it through a day to not making it at all. But the LG V20 was the last of the flagship phones with a removable battery, so that concern didn't apply either.

Through a freak accident, on the same weekend my LG V20 arrived, the camera on my Moto G5 Plus was smashed. My best guess as to what happened was that I had the Moto G5+ in my cycling jersey pocket, my kids pedaled the tandem right into my back, and the nose rivet of my Brooks saddle smashed the lens like a hammer. In the old days, losing a smartphone camera would have been a "so what" event, given that I have dedicated cameras and am not afraid to use them. But nowadays all sorts of apps on the phone depend on the camera, including the all-important check deposit app.

So when I got home and unboxed the LG V20, I didn't just plug it in, I removed the SIM card and SD card from the Moto G5+ and put those into the new phone as well. I did eventually repair the Moto G5+, but it was a ham-fisted repair and didn't really restore the camera to full functionality (the camera's output is still marred).

The LG V20's SoC, the Snapdragon 820, dates from the same era as the Moto G5+'s Snapdragon 625. In all the benchmarks, the 820 runs circles around the 625: not only are the cores fully custom Qualcomm "Kryo" cores, they're also clocked higher. But in real life we run applications, not benchmarks, and the LG V20 doesn't feel appreciably faster than the Moto G5+ did, just more power hungry. With one exception: RideWithGPS's route planning web-site does feel faster, because that single-threaded Javascript site runs faster with higher single-core performance. For native Android apps, by contrast, the 625's 4 additional cores make up for its slower individual cores.
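
To make the single-core vs. multi-core point concrete, here's a toy benchmark (my own sketch, nothing to do with either phone's software, and the absolute numbers will vary by machine): the same fixed pile of CPU-bound work, run on one thread and then spread across all available cores. A single-threaded workload, like that Javascript site, only benefits from per-core speed; a well-threaded native app also benefits from core count.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class CoreScaling {
        static final int TASKS = 8;

        // A simulated CPU-bound unit of work.
        static long busyWork() {
            long acc = 0;
            for (int i = 0; i < 50_000_000; i++) acc += i * 31L + 7;
            return acc;
        }

        // Runs TASKS units of work on the given number of threads and
        // returns elapsed wall-clock milliseconds.
        static long timeWith(int threads) throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            List<Callable<Long>> tasks = new ArrayList<>();
            for (int i = 0; i < TASKS; i++) tasks.add(CoreScaling::busyWork);
            long start = System.nanoTime();
            pool.invokeAll(tasks); // blocks until every unit finishes
            pool.shutdown();
            return (System.nanoTime() - start) / 1_000_000;
        }

        public static void main(String[] args) throws InterruptedException {
            int cores = Runtime.getRuntime().availableProcessors();
            // One thread sees only single-core speed (the 820's strength);
            // many threads see core count (the 625's strength).
            System.out.println("1 thread:  " + timeWith(1) + " ms");
            System.out.println(cores + " threads: " + timeWith(cores) + " ms");
        }
    }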

In exchange, the phone's battery life is abysmal compared to the Moto G5+. While I could regularly charge the Moto G5+ to 80% and make it through the day, there's no way the LG V20 could do so. In fact, if I charged it to 100%, it might make it to 6pm before begging for more power. And that's without doing anything expensive like navigation. I immediately bought a 2nd battery for the phone. Given that the phone's battery is replaceable, I could charge the battery to 100% each time without worrying about longevity.

The big pluses are significant. First of all, the audio is indeed awesome. I'd come close to retiring my Sennheiser HD 600 headphones, because they weren't appreciably better than other random stuff I had sitting around. Plug those into the LG V20, and wow. OK, I just didn't have suitable amplification to drive them before! They sound awesome! The bluetooth stack seems better engineered as well, dropping much less frequently than the Moto G5+ did when playing music wirelessly (and the device will use AptX-HD if your headphones support it). My Garmin watch also disconnected much less frequently, and at greater distances, than it did with the Moto G5+. The phone does connect with AptX-HD to my Sony X1000/M2, but I don't think I can actually hear the difference between AptX and AptX-HD.
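
Some back-of-envelope arithmetic explains the amplification point (the figures are Sennheiser's published specs as I recall them, so treat them as assumptions): the HD 600 is a 300 ohm headphone rated at roughly 97 dB SPL for 1 Vrms of input. Power delivered is P = V^2/R, so 1 Vrms into 300 ohms is only about 3.3 mW. A typical phone headphone jack swings around 0.5 Vrms, which is 6 dB less, or roughly 91 dB SPL, with little headroom left for dynamic peaks. A source that can swing 2 Vrms buys back 12 dB over that phone jack, which is the difference between anemic and authoritative --- and that's the gap a proper DAC/amp stage like the V20's fills.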

The bigger screen is better, though I'm not sure I notice the resolution increase. I didn't miss having NFC on the Moto G5+, but it's actually fairly useful. Not just because of Android Pay (which is mostly a gimmick --- you still wouldn't leave the house without your wallet, not just because your driver's license and health insurance cards are in there, but also because enough vendors still don't take Android Pay that you'd be stuck without a payment method in the worst possible places), but because of the "tap to link" camera pairing that Canon implemented in both the M5 and the G7X2. Now that's a feature that no iPhone has. The NFC antenna/chip lives in the back cover of the case (near the top), rather than in the battery (like some Samsung phones), so you can swap out the battery without losing NFC, a very nice feature. You can even buy a Murgen 9300mAh extended battery that comes with a new cover and the NFC chip for extended run-time, though apparently the added weight of that battery means the phone is no longer mil-spec for drop purposes, and you can't find a protective rubber case that fits the phone with the big battery attached.

The fingerprint reader's on the back of the phone, which is useful when picking the phone up off the desk, but not if the phone is lying on a table while you're eating breakfast, for instance. I also miss the "touch gestures" that Motorola implemented on the Moto G5+, which saved some screen real estate. That's made up for by the fact that a 5.7" screen with 0.2" lopped off for the navigation buttons still gives you a 5.5" screen.

The camera is meh. It's not nearly as good as the Moto G5+'s, which surprised the heck out of me, given that the LG V20 has 3 cameras (2 rear and 1 self-facing). I was also surprised by the lack of a selfie flash, which was present on my wife's Moto Z Play, a phone that's not even in the same price range. The V20 is also not waterproof, but again, if it didn't have a user-swappable battery, I wouldn't have given the phone a second thought --- my experience buying a refurbished Samsung S7 was that refurbished phones are worthless, not because the phone isn't functional, but because battery wear usually renders it useless: it doesn't matter how many cool features your phone has if the battery is dead.

The second screen on the phone is also pretty worthless --- it just doesn't add enough usability to the device for me to value it highly, and it feels like it's just using up power for no reason. The LG V30 probably eliminated that feature for this very reason.

I do miss the Moto G5+'s gestures: twist to launch the camera, shake to turn on the flashlight. The LG equivalents are clunky: you tap the volume down button twice to activate the camera when the phone is locked, but because of where the buttons are positioned, I have to use my thumb to do that, which is ergonomically unsound. Maybe if I were left-handed it would work better. And the flashlight has mysteriously turned on in my pocket for no reason I can discern, and then it's a bear to turn off, requiring unlocking the phone and multiple gestures.

For those who care, the V20 does get excellent software updates. After I booted up the phone, it immediately popped up update notifications, and a few days later gave me yet another security update. The phone's even supposed to eventually get Android Oreo. The Moto G5+, by contrast, got maybe 2 updates in the nearly 1 year period during which I owned it, and even though it too is supposed to get Oreo, it's quite clear at this point that Motorola is only going to follow through on that reluctantly, if at all. My wife's Moto Z Play does get fairly regular updates, however, so this is entirely due to the price/tier of the phone rather than Motorola's inability to keep up with Android revisions.

All in all, there's no way this phone was worth the $500 premium over the Moto G5+'s price when both were new. And I wouldn't pay more than the $135 I paid for the LG V20. But at the price I paid, I'm somewhat OK using this phone. It's got some pluses, some minuses, and overall, the pluses are just barely enough to make up for the minuses as long as I'm not traveling.

But I now know what I'd really like to see as a "flagship" device. I'd like to see the "flagship" features (e.g., huge screen, NFC, waterproofing, nice camera, micro SD card, headphone jack --- especially with the Quad DAC that LG put in), but paired with a power-efficient chipset like the Snapdragon 625 and a lower resolution screen to save battery power. Now that would be a phone worth paying real money for. But of course, no such phone exists, and it doesn't look like any of the Android vendors will have the courage required to make such a radical move in the near future --- they're too busy chasing Apple. Which is a real pity, because again, a phone with a dead battery is a phone with zero features, which is what I see all too frequently with this phone.

Friday, October 30, 2015

Review: Lightroom 6

I actually upgraded to Lightroom 6 a while back, but waited until I had done a couple of trips with large amounts of photography before writing a review, to get a realistic view of what it does: which features I turned out to use a lot, and which I thought I was buying it for but didn't end up using.

I'd skipped Lightroom 5, mostly because it included zero features that I thought were useful to me. Lightroom 6, however, included several features that I thought were potentially ground-breaking:
  • Photo-Merge: including merge to Panorama and merge to HDR. I hadn't been experimenting with HDR, but prior to Lightroom 6, I was using Microsoft ICE to stitch images. It got to the point where I used a pre-set to automatically export and merge using Microsoft ICE via the command-line. The benefit of Lightroom doing it all internally is that you end up with a RAW merged file, which means that you can use ND grad filters and other tools uniformly across the final image. This is huge! Suffice to say that I would have paid the upgrade price (albeit reduced because of an employee discount) just for this feature alone.
  • Face recognition. I gave up tagging all my kids photos manually because it was too much work. It'd be nice for this to be fully automated.
  • Performance. While my i7 920 is still faster than most machines out there (very impressive given its age --- but mostly an indictment of how laptops have taken over the world), I also have a high end GPU sitting in the machine that's just begging to be used. Lightroom 6 promised to make use of this otherwise idle silicon. More performance is always good!
So in practice, how did these features fare? Face recognition was an obvious bust. Turn it on, let your machine chug for a day, and come back to discover it's still not done. The face-recognition software seems to be single-threaded, and doesn't make full use of the CPU or GPU.

GPU acceleration was also disappointing. First of all, it crashes a lot on the 7870. I finally found some article on the internet on how to configure the driver so Lightroom stopped crashing. However, I'm not sure I noticed any performance difference: I'm guessing my machine was already fast enough, and the acceleration didn't do much for the batch jobs I use (bulk-export, import of photos). Where I thought it might help a lot would be on my wife's Surface Pro, which didn't have quite enough CPU power so Lightroom was frequently laggy, but in practice, I didn't notice much difference there either.

Photo-Merge, however, paid for the upgrade all by itself. I found myself using it a lot, and even better, the UI is designed right. You select a few pictures and hit the Merge button. The machine chugs for a bit and delivers you a preview. If you like the preview, hit the "Merge" button, and the merge happens in the background, using spare cycles while you go on to do other editing tasks! This is pretty amazing. The resulting merged image was frequently too large for Facebook (not a surprise) and also taxed the Surface Pro to its limits when loaded into RAM. But that's what I want. The same image on my desktop took no appreciable extra time to load and was subject to all the manipulation I wanted.

There are other nits in the UI that have carried over from previous versions of Lightroom (for instance, when you shell out into Photoshop to do some editing, it creates a second copy of the picture but doesn't place it next to the original for easy selection/culling). But by and large, I'm happy with this upgrade. If you don't already use Lightroom, moreover, and you want to be a serious photographer, there's really no other tool out there that does what Lightroom does (believe me, I've looked). There's good reason why many photographers go to the trouble of building a machine just so Lightroom flies. It's too indispensable a part of a serious photographer's workflow to forgo. If you trouble yourself with any camera other than your smartphone, then you owe it to yourself to spend a fraction of that camera's budget on software to get the most out of it. Recommended.

Friday, May 22, 2015

Review: Garmin Vivoactive

Last year, I bought, tested, and returned the Garmin Vivofit. While it was a reasonable device for people looking to start exercising or otherwise return to fitness, it wasn't a suitable product for someone like me. When prices dropped this year, however, I bought 2 for my parents to replace the pedometers that had been very flakey for them. The Vivofit was an ideal product for them: it didn't need to be recharged, and sync'ing to their PC was a matter of pushing a button and waiting while Garmin Express picked up the data and uploaded it to the cloud. Even better, they didn't need to sync more than once every week or so, and the product would survive being worn 24x7, so they never got a chance to forget to put it on. It would even survive being washed in the laundry!

While buying the Vivofits, however, I noticed that Garmin had launched the Vivoactive, a product much more suited for someone like me. Since my brother had a birthday coming up, I bought him one, despite his skepticism. (My brother is an Apple iPhone user, and a first-round Kickstarter backer of the first generation Pebble Smartwatch.) Upon receiving the Vivoactive, he was so positive about it that he asked my other brother for a full suite of Garmin bike sensors for his birthday. It immediately replaced his Pebble, and got him to track his cycling and steps/day as well. (He once had an Edge 305, but never replaced it once the battery died.)

With that level of enthusiasm, I bought one for myself with the help of a Best Buy coupon. If you know me, I'm as cheap as they come. When I told one of my friends that he was as cheap as I was, he said he didn't know whether he should have felt complimented or offended. To get an idea of why the Vivoactive is such a good value, consider that it converges/replaces the following products all at once:
  • Garmin Swim ($150): stroke tracking, lap counting, swim timer
  • Garmin Vivosmart ($150): Step tracking, sleep tracking, smartphone notifications, ant+ bike sensor and hrm pairing, VIRB action camera control, auto-sync
  • Garmin Edge 200 ($130): Cycling GPS (no barometer, no sensor pairing)
  • Garmin Approach S2 ($190): Golf GPS. I'm not a golfer, so no comment.
  • Garmin Forerunner 220 ($200): Running GPS with foot pod pairing and accelerometer for indoor training.
No normal human will make use of all the features of the Vivoactive (I don't know anyone in the intersection of Golf+Cycling), but if you do 2 of the above activities, the Vivoactive will provide more than sufficient coverage. Since I swim twice a week, cycle 4-6 days a week, and hike about once a week or so, I'd extract quite a bit of value out of it. Against that, I already have an Edge 800, which works pretty well. The Edge also has a barometer, which makes it much more accurate for measuring elevation and brag-worthy cycling metrics like elevation gained during a ride and the current gradient, though the latter bonks at grades much steeper than 16%, making it worthless for the truly brag-worthy rides.

So how does the Vivoactive work out in real life? The first feature you notice when you power it on is the always-on watch display. If you're a lifelong watch-wearer, this won't seem like a big deal. But I hadn't worn a watch since I was 21, and the first time I saw someone wearing an Apple Watch I thought it was broken or the battery had run down because the screen was blank. It wasn't until the person stooped to pick up something and the display flashed on that I realized the blank screen was a power-saving feature. I'm happy to say that the Vivoactive serves as a watch just fine, with a white-on-black default display for time, date, and current charge status. It's not flashy and doesn't call attention to itself, but it's thin and robust, and you don't have to use an exaggerated motion of the wrist in order to tell time.

One interesting thing about sync'ing it to my PC is that my version of Garmin Express was old, and hadn't updated itself (I didn't realize it didn't do that automatically). When I plugged in my Vivoactive, Garmin Express got confused and led me down a garden path trying to sync with it until I realized the problem and upgraded it. After that it was a snap, downloading and installing new firmware onto my Vivoactive quickly and easily.

The Vivoactive came charged to 92%, so I immediately took it for a ride, pairing it with my bike's sensors and heart-rate monitor and running an Edge 800 in parallel, so I could compare the results. Here's the Vivoactive track, and here's the Edge 800 track. You can see that with the exception of elevation data, both tracks are essentially indistinguishable from each other. What you can't see is that the Vivoactive was much faster at satellite lock-on and booting up than the Edge 800! Brad Silverberg had raved to me on Facebook about how quickly GLONASS+GPS locked on, and I hadn't realized how quick it was until I did the back-to-back comparison against the Garmin Edge. Let's just say that while I could keep the Edge 800 confused for half a minute by cycling quickly during the boot-up phase, I could not keep the Vivoactive confused for even 5 seconds. Even more importantly, because the Vivoactive is an "always on" watch, there's no boot-up period! Even before you can select the "Bike" function and push the start button, the GPS has already turned on and satellite tracking has started!

Just as importantly, the display, albeit small compared to the Edge, is crisper, brighter, and more readable in direct sunlight! It beats the Edge 800 by a mile in that regard. In fact, I'll go as far as to say that it beats my 2-year-old Basic Kindle, which of course outperformed any color screen in daylight until the Vivoactive came along. I don't have a Paperwhite to compare it with, unfortunately. By the way, you'll read in Amazon reviews about how difficult the display is to read indoors. I call bollocks. It's actually far easier to read than any traditional watch I've ever seen!

If you turn on GPS+GLONASS, your battery life isn't going to be anywhere near the 10 hours Garmin claims for GPS tracking. But overall, the Vivoactive more than holds its own against my Edge, with the exception of elevation, where it's within about 5%.

Next up, hiking/walking. I'll note that if you have an Edge unit, you can buy a (relatively) cheap wrist-strap, stick your Edge on it, and use it to track your hikes. I've done that in the past, but it's not as satisfactory: you don't get pace data, nor do you get the step counter functionality (which runs in the background!). Again, with GLONASS+GPS, you can even see where I cheated and cut across the parking lot at the end of the hike. The Vivoactive is also much more comfortable to wear on your wrist than any of the Garmin Edge units, which are thick, bulky, and aren't really intended to be worn, so you have to tilt your head a bit to read them.

Swimming: I did a swimming workout and discovered to my disappointment that the device doesn't actually attempt to figure out what swim stroke you're using, which the Garmin Swim does. What it did do a good job with, however, is providing a stroke count, time per lap, and lap count. (The latter is useful because swimming is so boring that I swim with headphones and music, and occasionally would lose count and forget to switch to the next segment of my workout.) Reviewing the data from the session, I could clearly see the kick-board laps, and it was fairly easy to see when I was using the crawl vs the backstroke, breast-stroke, etc. So while stroke detection would have been nice, it's not necessary. If I were a more serious swimmer I'd try to improve my times, etc. But much like the Vivoactive, I'm a jack-of-all-trades and master of none.

Now, open-water swim is explicitly not a feature of the Vivoactive, but since I was in an outdoor pool, I tried it anyway using walking mode. The results are as you might expect: with only intermittent GPS pickup (yes, I had GLONASS on as well), the track jumps all over the place and the errors are huge. On the other hand, it's better than nothing, and you do get what seems to be a reasonable mileage reading at the end. Note that while swim mode turns off the touch screen, walking mode doesn't, and that can cause weird things to happen due to water splashes. On the other hand, since the start/stop button is a physical one, you can't accidentally lose data to the water splashes.

As far as smart-watch notifications are concerned, they're actually surprisingly useful, especially when cycling. I'm used to ignoring my smartphone's various noises while cycling, though I do stop to take calls when I'm not wearing a bluetooth headset. It's very nice to see e-mails/texts flow through to the watch, glance at them, and then let them disappear, knowing that it's nothing urgent. This is one feature that's surprisingly useful whether you're driving, cycling, or even in the middle of a hike. What I did not test is the music control functionality. That's because when hiking, I use a bluetooth headset which has physical buttons for controls, and that's just going to be better than any touch screen. The same applies while driving. While you can click through on a notification and read the e-mail or text message, you can't reply on the Vivoactive. For that, you'd have to pull out the phone, which I think is a perfectly acceptable approach.

As an activity tracker, the device works as well as the Vivofit. What's nice is the automatic sync'ing via smartphone. Of course, this leaves me with a dilemma, since the Edge 800's data is definitive but doesn't get sync'd more than once a week, while the Vivoactive's data is always up to date but has suspect elevation data. Given the convenience, however, I am very tempted to use the Vivoactive's data and just not ever sync my Edge 800, using it as an on-board display and an odometer for each bike, which is something the Vivoactive doesn't do. (Neither does the follow-on Edge 810, for that matter!) The Vivoactive (like the Vivofit, Vivofit 2, and the Vivosmart) nags you every hour to walk about 100 steps or so in order to stay active. The vibration is subtle and not aggressive, but it's there, and the red bar is very much guilt-inducing, so if you tend to sit a lot (and what software engineer doesn't), that's a good feature. I was previously using Moves, and running both the Vivoactive and the app confirms what I've long suspected: Moves systematically under-counts steps and miles cycled. Since Moves got bought by Facebook, the app has not been updated, and I suspect the server-side applications will probably be killed before long, just like Friendfeed was. The competing app Google Fit gets crappy reviews even from ex-Googlers, which explains why FitBit, for instance, has been so successful that it will soon file for an IPO.

As a sleep tracker, the data is nice, but I'm not sure what to do with it. In combination with my CPAP machine, however, I now have more data than I know what to do with. The only thing I'm missing now is an oximeter, which has already been proven to be remarkably worthless for someone like me.

The battery life is acceptable. With 3 hours of hiking, 2 hours of cycling (all with GLONASS+GPS on), an hour of swimming, and 2 days of sleep tracking and activity tracking, the battery was down to 40% after 2 days. I expect the battery life to be better with GLONASS off. The battery charges from 40% to 99% in about 90 minutes from a computer's USB port.

As far as comparison with other devices are concerned, the obvious one is the Garmin Fenix 3. This offers most of the features of the Vivoactive, plus Openwater Swim, Triathlon mode, a real barometer, a compass, climbing mode, and a skiing/snowboarding mode. Of course, at twice the price ($500, $600 for the super-tough sapphire version), you literally pay for it. It's also much thicker, heavier, and bulkier. While the barometer is nice, elevation data isn't very accurate if the unit's thermometer is next to your skin, since to get correct elevation you need accurate air temperature. So if you want a reliable temperature you need to also pick up the Tempe sensor, which is also compatible with the Vivoactive. From my perspective, I think Garmin lost an opportunity by not selling an external barometer/temperature sensor for the Vivoactive.

The inevitable comparison is with the Apple Watch. Here in Silicon Valley, I've already seen many people walking around with those blank screens attached to their wrists. But that's a function of Apple marketing much more than anything else. If the two products had their parent companies swapped, I'm pretty sure the features of the Vivoactive would be touted as revolutionary (week-long battery life, thinner, swim tracking, 50m water resistance, GPS+GLONASS that doesn't depend on your phone) while folks would be making fun of the Apple Watch (wearing a thicker blank screen? Having to charge every day so it can't even do sleep tracking?). But it is what it is. Despite next to zero marketing from Garmin, my local REI (in Silicon Valley!) reports that their black Vivoactives (sans heart rate monitors) sell out as soon as they come in, and that they only have the white ones in stock. So it does seem that the outdoors people do understand and value the product, even while the tech press (and the outdoor press, as far as I can tell!) has basically ignored the Vivoactive.

Regardless, the Vivoactive comes highly recommended. The value and functionality this product represents are pretty much unbeatable. Even if you're not a watch wearer (I wasn't), this product might be worth changing your mind for. As mentioned above, if you regularly use any 2 of the functions the Vivoactive supports, you'll get your money's worth (and then some --- there's significant value to convergence into a package that's smaller than every one of the products the Vivoactive replaces). The only folks I can think of who would be unhappy with the product are the ones who use Windows Phones and hence lose out on the smartwatch features, but even for those folks, having to sync the device manually through a PC might be worth the trouble if they're regular swimmers.

Further reading: DC Rainmaker's In-Depth Review

Wednesday, February 11, 2015

Interviewing and Performance

Recently, someone asked me a deep question: we all know (and Google has the data) that interviews do a poor job of predicting on-the-job performance. If that is the case, would a different form of interviewing (say, pair programming) or other form of testing do a better job?

My answer to that question is "no." What most of the other articles do not note is that Google actually does have data as to the major factor influencing on-the-job performance (at least, performance as viewed by Google's notorious promotion committees). It turns out that even in 2003-2004, Google had data indicating that your first tech lead at the company strongly predicted how well you would do in the future inside Google.

There are several reasons for this. One obvious one is that the promotion committee is likely to weigh your tech lead's comments on your performance more heavily than some other random engineer's. The deeper reason, however, can be found in the book Chasing Stars. Fundamentally, all organizations have stated or unstated rules for how they work. How well the on-boarding process explains those rules to new employees and indoctrinates them in the culture goes a long way toward explaining future performance.

Google at the time I joined was a largely oral culture. A conscientious noogler working through the engineering training documents during his first week would find several bugs a day in the documentation, each necessitating a documentation change. Out-of-date documentation was rampant, and the tech docs team had their hands full trying to keep up with the amount of code and internal APIs continually being churned. If you actually had to get work done, your most important tool wasn't the documentation or the internal search engine (which was laughably bad), but knowing who to talk to. For instance, if you needed to make a change to crawl, and your tech lead knew to say, "Go talk to Arup Mukherjee and ask him how you would do this," you were in luck: you'd be productive and efficient. If your tech lead said, "Go read the documentation," or worse, "Use the Source, Luke," not only would you waste a lot of time reading both code and documentation (as I was forced to once when dealing with the search results mixer), chances are you would have done it wrong when you were done, and your code reviewer would spend gobs of time correcting the way you did things, forcing you to do everything over and over until you got it right. If that happened, you might as well kiss your "Exceeds Expectations" performance review goodbye. (And yes, I lucked into knowing people who wouldn't just tell me who to talk to, but walked me to their cube, provided introductions, and made it clear that what I was supposed to do was important enough to deserve help.)

I'm fond of saying that context matters a lot when it comes to performance. This type of context-sensitive performance isn't necessarily because the tech lead deliberately led the poor engineer wrong. It's because the tech lead did not provide a suitable context for the engineer to work with, and in the process made the job much, much harder (or in some cases nearly impossible) for the new engineer. Hence if your interview process is successful in eliminating people who can't actually do the job, but you end up with variable performance or unexpectedly poor performance on the job from people who should be doing well, you need to examine your on-boarding process or the training process for your leads/managers.

The follow-up to this question then is, "If performance is so context-determined, why do we bother with interviews?" The answer is that the goal of the interview isn't to predict performance in the future. The goal of the interview is to ensure sufficient technical competency and cultural compatibility so that with a good on-boarding process or a decent tech lead/manager, the new engineer ought to be able to do a great job. Hence, when I run interviews, I don't favor esoteric problems that require dynamic programming (for instance), but basic data structure questions. While I consider basic tests such as the Fizz Buzz Test way too simple and insufficiently indicative of someone with basic computer science knowledge, coding questions that approximate that level of complexity (while still testing basic computer science concepts) are all that is typically needed to weed out people who simply can't code and shouldn't be allowed access to your source control system.
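
For concreteness, here's the level of question I have in mind --- my own illustration, not a question from any actual interview loop: a notch above Fizz Buzz in complexity, but still exercising nothing more than a basic data structure (a stack):

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class Balanced {
        // Returns true if every (, [, { is closed by the matching
        // bracket in the right order.
        static boolean isBalanced(String s) {
            Deque<Character> stack = new ArrayDeque<>();
            for (char c : s.toCharArray()) {
                switch (c) {
                    case '(': stack.push(')'); break;
                    case '[': stack.push(']'); break;
                    case '{': stack.push('}'); break;
                    case ')': case ']': case '}':
                        // A closer must match the most recent opener.
                        if (stack.isEmpty() || stack.pop() != c) return false;
                        break;
                    default: break; // ignore non-bracket characters
                }
            }
            return stack.isEmpty();
        }

        public static void main(String[] args) {
            System.out.println(isBalanced("f(a[0], g(b))")); // true
            System.out.println(isBalanced("f(a[0)]"));       // false
        }
    }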

Tuesday, February 10, 2015

Review: Windows Server Essentials 2012 R2

My Windows Home Server (original) has been running very nicely since it was first installed in 2011. The nightly backup feature has saved my ass countless times, and I've done many an image restore from backup after trying a new OS (such as the Windows 10 Technical Preview) and then backing out of it. It also serves as the central file server for the home, exporting music/video/etc via DLNA to the living room PS3. All in all it's been an incredible value.

Unfortunately, WHS is showing its age, and has been de-supported. It was still running fine for us until very recently, since two of its limitations were no big deal:

  1. No support for larger than 2TB disks (6TB of local storage was plenty, thank you)
  2. No support for UEFI machines (we didn't have any, except for various Windows tablets, which didn't store any data worth backing up)
Then we got a Surface Pro, which is a UEFI machine. The WHS had also started getting a little flakey, and I'd eventually have to replace my desktop and laptop with newer, UEFI machines, so I started shopping.

One obvious upgrade was to go to Windows Home Server 2011, but that's also showing its age, and it lost the Drive Extender feature of the original WHS. We could go with hardware RAID, but as explained in the Windows Home Server review, RAID is a mixed bag: if the RAID controller fails, you have to replace it with an identical piece of hardware or risk losing your data. RAID isn't a great solution for home use.

Windows 8.1 has support for Storage Spaces, which has many interesting features that in many ways mirror what was in Drive Extender:
  1. Drop any set of storage spaces disks into any Windows 8.1 PC, and you'll be able to crack open and view all the files. This sort of commodity hardware data storage is invaluable.
  2. You can mix and match drive sizes into a storage pool, and then create virtual disks that can be mirrored (RAID 1), striped (RAID 0), or with parity (RAID 5) across the multiple drives. This is very nice, since you can then upgrade storage slowly. You can even designate spare disks and "hot spares", to automate failover. This effectively lets you tell Windows to have different data policies for different type of data, for instance asking for photos to be mirrored while videos are striped.
  3. You can even thin-provision virtual disks, and have Windows warn you when you need to add storage.
The problem with Windows 8.1 is that it doesn't support full bare metal backup and restores. For that feature you need Windows Server Essentials 2012 R2, a full on business server. This is when you realize what a great deal the original Windows Home Server was, since the retail cost of Windows Server Essentials 2012 R2 is more than what I paid for the original Windows Home Server hardware!

No big deal. I worked in the software industry, so I pulled some strings and got a copy at employee pricing. Then I needed hardware to run it. I thought about repurposing the Windows Home Server, but then realized it was a bad idea: I needed to do a server-to-server copy, and the Acer was also headless. While it's possible to do a Server install headless, it's not for the faint of heart.

The cheapest server you can buy is the Lenovo TS140. You can get it for around $225 for an i3 machine, 4GB RAM, and no HDD. However, I found a deal where for well under $300, I got a quad core Xeon configured with ECC RAM. Now that I was expecting to handle tens of terabytes, I figured ECC RAM was worth paying for. The extra CPU is also helpful for running server applications such as Plex, which basically transcode video sources on demand for targeted delivery.

Installing WSE 2012 R2 is straightforward. However, I learned a few things about UEFI machines and WSE 2012 R2 that weren't documented elsewhere that I looked:
  1. OS updates will not work if you have secure boot turned on. Turn off secure boot.
  2. WSE 2012R2 (or any version of Windows) will not allow you to use excess space on the boot drive as part of a Storage Spaces pool. So effectively, your 4-drive-bay server is now a 3-bay server! This isn't a big deal if you have 4TB disks in the server, since that's plenty of storage, but it does make the server smaller than you expected.
  3. WSE 2012R2 is a business server, so the first thing it does is set up a domain. This is no big deal, as it's very automated and easy. However, when you connect an existing computer to a WSE server, the first thing it does is register the new machine with the server, with the server providing the domain. This is no big deal with Windows 7: your login prompt changes so you have to hit control-alt-delete to log in, but there's no difference otherwise. With Windows 8, though, if you previously logged in using a Microsoft cloud account and then log in using the domain account, you lose all the cloud customizations you used to have! The solution is to hack the registry on the Windows 8.1 machine to skip joining the domain. If you were truly running a small business like a dental office, this isn't what you want (you wouldn't want your employees logging on with their cloud accounts), but for a home user upgrading from WHS, it's the right thing.
Other than that, everything was fine. I ran a backup and bare metal restore on my Lenovo X201, and things went smoothly. Then I tried it on the Surface Pro, which presented 2 SNAFUs. First, bare metal restore doesn't work over WiFi (not surprising). This was easily resolved, since a USB ethernet adapter was already available for high speed network connectivity. Since you also need to plug in a USB thumb drive for booting, you'll need a USB hub. There are various warnings that you need a powered hub, but my unpowered hub was fine provided I only had the thumb drive and the ethernet adapter hooked up to it. The final SNAFU was when I tried booting from the thumb drive and got an error. This one turned out to be secure boot's fault. Turning off secure boot on the Surface Pro got the bare metal restore working with zero hitches.

I transferred all the data over from the old WHS server. It turned out that the old server was CPU constrained. While the old server topped out at around 45MB/s, the new one peaked at 65MB/s. This is pretty sweet. I could also run the Plex Server on it without the CPU even breaking a sweat.

The nasty thing about Storage Spaces, however, is that it doesn't auto-rebalance when you add a new drive. You can force a rebalance by adding the new drive, creating a new virtual disk that makes use of it, copying the old data over, and then deleting the old virtual disk. This is kinda more futzy than I'd hoped, given that the old WHS kept ticking for years on end without me having to do manual rebalancing, but again, if your old server failed, you could move the drives over to a Windows 8 PC and everything would just work, so this is a tradeoff I'm willing to live with.

All in all, would I recommend this? If you have an old WHS that's still operating and you don't have any UEFI machines, I'd recommend sticking with it for as long as you can stand it. If you have an existing old-style NAS RAID, I think the Windows solution is superior to any of the freeware servers, especially since ZFS requires gobs of RAM and the low end servers are cheap, assuming you can snag a copy of Windows Server Essentials 2012 R2 at employee/educational institution pricing. The combination of file server, bare metal backup and restore, file history, and DLNA server is pretty sweet.

If you ask around, most people (even software engineers who should know better) rely on cloud storage or don't back up their data at all. If you shoot with a modern digital camera (as opposed to a crappy phone camera), videos and photos quickly fill up terabytes, making cloud storage prohibitively expensive. Under such circumstances, a home server that backs up all your computers is well worth the cost, and WSE is surprisingly efficient and easy to use. Recommended.

Monday, February 09, 2015

Review: Outlook 2013

As documented previously, I've recently run into quota issues with Gmail. Along with the need to aggressively delete e-mail to get under the quota, I've finally decided to take backing up my e-mail seriously. This is important because, together with photos, a lot of e-mail is actually useful for searching and remembering details that aren't recorded any other way. Even if you've decided to pay for storage on Gmail, for instance, there's always the chance that your account gets hijacked, phished, or otherwise deleted/hacked, so having a backed-up local archive protects you in that event.

As a well documented cheapskate, I first tried the free solutions. Mozilla Thunderbird, for instance, is well known and popularly acclaimed. But it was too slow and couldn't really manage huge inboxes. I tried various other solutions before giving up and acquiring an Outlook 2013 license through an employee purchase program at Microsoft.

Setting up Outlook is fairly straightforward, and it auto-configures now for Gmail accounts as well as Hotmail and Yahoo mail. For Google Apps for your domain accounts, however, you have to go through custom configuration. There are a few bugs there, but eventually I got it so that everything sync'd.

Performance is decent. It can't match the search index behind the browser-based version of Gmail, but it's acceptably fast and works even when you're offline. The real feature that made me pay for a license, however, is auto-archive. This essentially lets you move old e-mail into an archive which you can then access and search separately. I ran an archive, and the machine went away for a few minutes and created a 9GB archive of my e-mail all the way back to 2004. It's searchable; opening the folder isn't fast, but it works, and after creating an index (which takes forever) I can apply filters and search it.

The biggest pain point is that I have to force myself to keep Outlook running. (It's not a hog: 200MB of RAM is all it takes --- keeping a Chrome window open to Gmail costs quite a bit more!) The sync with a live server is also somewhat slow: I frequently get a ping on my browser window a few minutes before Outlook fetches the mail. What's also interesting is that while it automagically imports your calendar (and it does a great job of that), it doesn't automatically import contacts, and the auto-complete does not auto-populate.

What auto-archive does not do, unfortunately, is to remove archived e-mail from the IMAP server (in this case GMail). This isn't great, but on the other hand, means I can now very aggressively delete e-mail in the future.

There's a market opportunity somewhere for an e-mail app that doubles as a backup for cloud-based storage, but I'm sure people like me are a rarity (most people don't even back up their photos). However, just like Open Office Spreadsheet was never a good substitute for Excel, there's no serious alternative to Outlook if you need offline e-mail or archives.
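
For what it's worth, the core of such an app is not much code. Here's a minimal sketch of the idea using the JavaMail API --- my own illustration, unrelated to how Outlook does its archiving; it assumes IMAP access is enabled on the account and that a username and an app-specific password are passed on the command line:

    import javax.mail.*;
    import java.io.FileOutputStream;
    import java.util.Properties;

    public class ImapBackup {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("mail.store.protocol", "imaps");
            Session session = Session.getInstance(props);
            Store store = session.getStore("imaps");
            // args[0] = username, args[1] = app-specific password.
            store.connect("imap.gmail.com", args[0], args[1]);

            Folder inbox = store.getFolder("INBOX");
            inbox.open(Folder.READ_ONLY);
            Message[] messages = inbox.getMessages();
            for (int i = 0; i < messages.length; i++) {
                // Write each message out as a standard RFC 822 .eml file.
                try (FileOutputStream out =
                         new FileOutputStream("backup-" + i + ".eml")) {
                    messages[i].writeTo(out);
                }
            }
            inbox.close(false); // false = don't expunge anything
            store.close();
        }
    }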

Recommended.

Monday, April 28, 2014

Review: Android Studio

Since helping my wife with her Nutrition Tracker App, I'd had a chance to try both Eclipse and Android Studio for Android app development. Both of them run on Windows, my preferred platform, but it didn't take 3 days with Eclipse before I got frustrated with frequent crashing, features not working, and a lousy layout tool. I found Android Studio, downloaded it, and soon persuaded my wife to switch to it.

Android Studio is based on IntelliJ IDEA. Back at Google when I was doing Java work, I avoided it like the plague, preferring to stick with Emacs and gtags. That's because Google's Java source base was so big you couldn't possibly load it into IntelliJ on the puny workstations of that time (yes, Google only supplied machines with 2GB of RAM in 2003), and even if it had been possible, those machines would have slowed to a crawl under the load of processing that much code. IntelliJ/Eclipse die-hards were resorting to wacko tricks like subsetting Google's code base so it could load into IntelliJ and then writing plugins into gtags for accessing the rest of the source code. I have no idea what Googlers do today, but my suspicion is that things haven't gotten much better.

For small Android projects like Nutrition Tracker, however, IntelliJ is just about right. If you're unfamiliar with the Android API, it supplies method name completion, tells you which arguments to supply in which order, automagically adds imports, and allows for automatic refactoring tricks such as moving methods, renaming variables safely, moving inner classes out of their outer classes, shifting classes between packages, etc. The layout tool helps you avoid having to learn the lame layout XML language, so you can actually try to make things work (as opposed to making things look pretty and usable --- I think Emacs is a great UI, so I have no expertise on those topics).

Android Studio is slow. It's slow to start up, slow to compile, and slow to run the debugger. A typical edit-compile-debug cycle takes around 10-20 seconds to build a tiny app. Note that I'm not complaining about Android Studio's massive use of resources while I'm editing. I think that's entirely appropriate. I want all my 4 cores/8 threads to be utilized in order to make my coding experience faster and more pleasant. I don't even mind the startup, since it doesn't need to happen that frequently, and it's a one-time cost. But the Gradle build system is not only a resource hog, it introduces additional latency into my think-time, so I begrudge every second it spends traversing dependency graphs instead of actually compiling code. I have no idea why the Android Studio engineers chose a clunky system like Gradle, as opposed to rolling their own and integrating it fully into the IDE. I never want to edit the Gradle build files manually, but the system forces me to. What's more, the syntax is really obscure and the documentation is inadequate.

For instance, when doing an Android release, the documentation only covers Eclipse. Worse, the documentation lies to you. It tells you to modify your manifest file, and I did --- then kept scratching my head as to why that never worked. It turned out that you have to modify the Gradle config, since the Android manifest XML settings are ignored when you build with Android Studio. Well, that took so much googling around that I can't remember what search term I used to uncover the Stack Overflow answer any more.

The source control integration is also funky. It supports Git, Mercurial, and Subversion, but not Perforce. Given that Google uses Perforce internally, I surmise that Google's internal projects do not use Android Studio. This does not bode well, since it means that Android Studio's most serious problems (build performance) will most likely never get addressed, because its non-existent internal customers will not feel the pain.

For quick and dirty Android projects, Android Studio is probably the best there is. If you're serious about building an Android app, however, my suggestion is that you use Emacs and roll your own build system that's decently fast. Otherwise, the benefits from using an IDE will be swamped by inordinately long compile/edit/debug cycle times. Note that though my machine is old, it's still decently powerful compared to even the fastest rig today, let alone the kind of laptops most "hip" developers favor, so it's unlikely you can solve Android Studio's problems by throwing more hardware at it.

Recommended only for non-serious Android projects. It's a great tool for getting started quickly, though, so use it to bootstrap yourself into doing Android development if that's one of your goals.

Friday, April 25, 2014

Why it now makes sense to build your own PC

I've always been tempted to build my own PC. I'm no stranger to the tools, since my internship at Geoworks effectively required me to take apart and put together the machine I was given at work. But until recently it made no sense. Machines were increasing in performance significantly, so every 2-3 years it made sense to get a new machine. When you're getting new machines that frequently, it doesn't make sense to build your own: the beige box vendors can get you much lower prices, and the cost of your time swapping motherboards, CPUs, etc., in and out would swamp the savings from bringing over your hard drives. Given Moore's law, every 2-3 years you'd have to buy all new hardware anyway!

I recently took a look to see if it was worth replacing my 5-year-old desktop. To my surprise, the answer was "no." Looking at the CPU benchmarks, a "modern" i7-4770 would clock in at less than twice the performance of my 5-year-old i7-920. In the old days, 5 years would have been enough to get at least a quadrupling of performance. Not even getting a doubling in 5 years would have been unthinkable. Part of it is that Intel's no longer getting any competition from AMD. Part of it is that going much past 4GHz would overheat a PC, so the easy route of merely increasing clock speed is out. And increasing the number of cores has already hit diminishing returns as far as most PC users are concerned (I'm an exception: I regularly process video).

The flip side of this is that the base operating system hasn't been using more hardware resources recently. Windows 8 is actually less resource hungry than Windows 7, which would have been unthinkable in the old days. Thanks to Microsoft's desire to compete in the tablets space with Apple and Google, Windows 8 actually runs decently on a tablet with just 2GB of RAM. This gave me the courage to replace my wife's 4-year old X201 with a Microsoft Surface Pro with half the RAM. My wife didn't even notice the missing RAM, despite running the resource hungry Android Studio, which is enough to spin my desktop PC's fan up.

This has several implications for users and developers:

  1. Rather than buying a mid-range machine and planning to replace it every few years, it might be cheaper to build a high end machine and upgrade components. Given that CPUs and motherboards no longer have to be trashed every few years, you might as well get a chassis that supports easy hard drive and SSD replacements/expansions, and GPU upgrades if you will run GPU-intensive activities.
  2. I/O standards do make a big difference, but any PC with a free slot will let you upgrade to USB 3 and other standards, so again, expand-ability might be more important than "planning to throw it away."
  3. An adequately high end machine will probably last a good 10 years in this environment (i.e., an i7-4770K wouldn't be obsolete for 10 years), which means it makes sense to put money into a high quality power supply, since the higher quality power supply provides cost savings when you plan to run a machine for that long. This is in contrast to the "buy-and-replace" strategy, where spending $20 more on a better power supply wouldn't pay for itself in power savings (see the back-of-envelope numbers after this list).
  4. This also seems to be applying to laptops, though laptops do benefit from the power efficiency gains of the latest processors, so if battery life matters to you, an upgrade every 4-5 years might make sense. The way most people seem to use laptops (constantly plugged in and never actually used as a mobile device), I think most people should replace laptops every 10 years, getting new batteries every 3-4 years or so, assuming that their device supports it.
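To put rough numbers on the power supply point (all of these figures are assumptions for illustration, not measurements): suppose the $20-more-expensive, higher-efficiency supply saves 10W on average, electricity costs $0.15/kWh, and the machine runs 8 hours a day. That's 10W x 8h x 365 = 29.2 kWh/year, or about $4.40/year. Over a 3-year replacement cycle the premium never pays back; over 10 years it returns roughly $44, more than double the extra cost.
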
I never thought I'd see the day when PCs would be expected to last as long as cars, but then again, I never thought I'd see the day when Microsoft would roll out huge new products and initiatives and everybody would just yawn. But yeah, my next PC is going to be something I build from the case in, and I'd be planning for it to last a good 10 years, something I did not expect when buying my previous desktop. I would have taken a completely different approach otherwise.

Wednesday, April 16, 2014

First Impressions: Microsoft Surface Pro

Our trusty X201 had been getting long in the tooth, and Xiaoqin decided to try some Android development. If you've ever tried Android Studio, you'll know that it's a CPU intensive process, since it's based on IntelliJ IDEA. The build system associated with Android Studio, Gradle, is also a massive CPU hog, and introduces no small amount of latency to the process. I never thought I'd miss the days of Visual Studio, but it's quite clear that compared to Microsoft's tool set, Android's development tooling is quite a bit behind, and extremely unstable. Of course, in terms of market share, the positions are exactly reversed.

After trying out a Surface Pro in the store a year or so back, I vowed never to buy a conventional laptop again if I could buy a Surface Pro-type device. Fortunately, Microsoft was having a sale on refurbished Surface Pros, so I was able to pick up a 128GB model for $450. You can find them for about $500 if you're willing to put up with a 64GB model. With USB 3 and a microSD card slot, it's probably no big deal if you can't find the 128GB model.

As a laptop, it's quite impressive. It's about 50% faster than the older X201, and 3X faster on boot up, hibernation, and recovery from hibernation, with boot times going from 30s to 10s. And yes, this is with the X201 upgraded to an SSD. There are a few strange fit and finish issues, such as the mini DisplayPort slot not being very deep, so when inserting a standard cable there's a little bit of chrome sticking out. The tablet comes with a pen, but there's no place to put it except in the magnetic charging port, and the magnetic charging port isn't strong enough to retain the stylus if there's any pressure whatsoever on it. Since this is an expensive Wacom digitizer stylus, you really do want to keep track of it!

Running Lightroom is fast, as you might expect, with no hitches, and the Surface Pro had no problem driving the 27" HP monitor at 2560x1440. One nice mode you can run is to have the touch screen run the Start screen, while the big display runs the desktop. This gives you a nice touch UI for the touch part, while having the desktop to do real work. Of course, Microsoft had to glitch this up --- in this mode, desktop apps still launch onto the small screen instead of automatically selecting the big screen. It's this kind of inattention to detail that gives Apple its edge over Microsoft, though I've found Macs to have their share of problems when using multiple screens.

The device has a fixed 4GB of RAM, but surprisingly, until I told Xiaoqin about it, she didn't even notice it didn't have as much RAM as her old device. At least part of the reason is that Windows 8 Pro actually consumes fewer hardware resources than Windows 7 did. The other part is that in recent years, software developers just haven't been able to assume more than 4GB of RAM anyway, so as long as you're single-tasking or running just one web browser and an application, you're generally OK.

As a tablet, the Surface Pro is quite hefty, though not as hefty as the X201. It makes up for that, however, with power. I'd already written about how much faster the Dell Venue 8 Pro is than my Nexus 7. Using the Surface Pro is instantaneous. The Type Cover is also a joy to use, giving you keyboarding performance akin to what I'm used to with the X201.

The real revelation, however, is the stylus. I'd never tried any of the previous PCs in tablet mode, other than my use of the Wacom Bamboo tablet for producing Independent Cycle Touring. But while I wasn't looking, Windows' handwriting recognition has become nothing short of amazing. My handwriting can compete with any doctor's for sheer inscrutability, but the Surface Pro handled my cursive with aplomb, as long as I was writing common English words. Write something not in the dictionary, and just like any other machine translation program, you end up with gibberish. There was no training period, however; I could just pick it up and use it. You could even turn on Chinese handwriting recognition, though Xiaoqin pointed out that Pinyin is faster and much easier to use with a real keyboard. Unfortunately, having multiple languages on the machine is problematic if you use a keyboard, since Microsoft uses Windows-Space to switch between languages, and Xiaoqin found it far too easy to hit that combination by mistake. In past versions of Windows we tried to change the language key bindings, but to no avail, so we gave up and uninstalled the language pack instead.

All tablets are compromises. The Surface Pro does not have great battery life: 3-4 hours with Android Studio running, and that's it. Under full load from Android Studio, the device also gets hot enough to turn on its fan, which makes a low hissing noise. It's quieter than the X201, but still noticeable if the room is otherwise quiet. Next to my Core i7 920 box going full bore, of course, it might as well be silent. At no point would you burn your hand grabbing the Surface Pro, however, so there aren't any safety issues.

Long term, the biggest concern about the Surface Pro is the battery. With the machine running hot and the battery kept fully charged most of the time in desktop mode, I would be surprised to see more than 3 hours of battery run time after the first year, and 2 after the second. Most laptop batteries get abused this way as well, but the Surface Pro's battery is not user-serviceable, with the only option being the $200 Power Cover. Fortunately, for the price (which is much less than what I paid for the X201 way back when), we can treat the Surface Pro as a disposable computing device. This is much more of a concern nowadays than it would have been 10 years ago. Back then, you'd expect to replace a machine every 3 years. Now, an adequate machine (which the Surface Pro most definitely is) has a potential lifetime of 5-6 years. At the rate Intel is improving (or not improving) CPU performance, I'm likely to keep my desktop for another 2-3 years at least!
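To make that guess concrete, here's a back-of-the-envelope Python sketch of the runtime estimate. The cycle count and fade rate are assumptions on my part, not measured figures:

```python
# A toy battery-fade model: assumes roughly one charge cycle per weekday
# and ~18% capacity loss per 250 cycles. Both numbers are guesses.
INITIAL_RUNTIME_HOURS = 3.5   # midpoint of the 3-4 hours observed when new
CYCLES_PER_YEAR = 250
FADE_PER_CYCLE = 0.0008       # fractional capacity lost per cycle (assumed)

for year in (1, 2):
    remaining = (1 - FADE_PER_CYCLE) ** (CYCLES_PER_YEAR * year)
    print(f"after year {year}: ~{INITIAL_RUNTIME_HOURS * remaining:.1f} hours")
# prints roughly 2.9 hours after year 1 and 2.3 hours after year 2,
# in line with the guess above
```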

There are a few accessories that I would recommend for the Surface Pro. The first is a Type Cover. We tried both the Touch Cover and the Type Cover in the store, and the Type Cover was hands down the winner. Secondly, you need a USB 3.0 hub if you're going to attach a debugging phone as well as a wireless receiver for a keyboard and mouse. The Surface Pro comes with Bluetooth, but it was easier to reuse the existing Logitech mouse and keyboard than to shop for new ones. USB hubs can be powered or unpowered; we got an unpowered one for convenience when traveling. It drains the device that much faster, but having one less power adapter to carry is worth it.

In any case, so far I'm liking the Surface Pro far more than I expected, and Xiaoqin hasn't asked for the older X201 back. I don't expect to send this one back to Microsoft when the 30-day return period ends.

Wednesday, March 26, 2014

The Oculus Rift "Sellout"

Yesterday's announcement of the Oculus Rift acquisition by Facebook has already garnered negative reactions amongst gamers and game developers. If you're a bystander, you might wonder why the reaction from the people who should be Oculus' biggest supporters is so negative. Introducing a new technology requires high volume, and in gaming that usually requires loss-leading hardware sales to drive that volume. Who is better suited to such an endeavor than Facebook, with its deep pockets and huge profits? Google and Apple have deep pockets too, but both would restrict such technology to their favored platforms rather than more open systems, while Facebook would be more platform agnostic than just about anyone.

The negative reaction can be explained by the principle of reciprocity. The initial Kickstarter backers of the Oculus Rift, and the game developers building for it, provided Oculus with a gift. The gift was intended to bring about an independent hardware platform that would be (rightly or wrongly) dominated by requirements driven by gamers. The backers did not intend to provide venture capital for Oculus to make a quick exit, and certainly not a sale to a big company with a history of indifference towards games, whose platform has historically supported games like Farmville, anathema to the hardcore gamers who comprise Oculus' demographic.

As for Facebook, this acquisition runs counter to the usual industry trends. The amount of compute power required to drive something like the Oculus Rift is enormous and power hungry. It is unlikely that the Oculus Rift can be tethered to anything less powerful than a Playstation 4 any time soon, and it certainly won't run on any of the laptops typically issued to a Facebook employee, let alone the smartphones favored by the trendy. It looks geeky, is unfashionable, and looks ridiculous when worn. The only possible good it could do Facebook in the medium term is if it got them into the living room.

Corporate head-honchos at Google, Amazon, and Apple have long looked at the living room game console as the entry point to taking over the entertainment center of the home. The numbers look tempting to the corporate types: hardcore game consoles from Sony, Microsoft, and Nintendo have only penetrated 56% of US households, and the other 44% looks ripe for disruption. However, these corporate types tend to have zero passion for gaming, and most have never so much as held a gaming controller in their hands. They tend to envision something like the Ouya or the Chromecast, neither of which provides sufficient power or quality content to get 6-year-olds excited, let alone hard core gamers. They fail to understand that the quality of content (whether it be a video game, high quality Blu-ray viewing, or streaming) is the reason the game consoles have so far had a huge share of living room usage.

The Facebook acquisition of Oculus Rift runs counter to that type of corporate thinking, and might actually succeed, provided it doesn't start off by pissing off so many hard core supporters that the well is poisoned. That disadvantage is possible to overcome, but only if Facebook does a thorough job of winning over gamers and developers through the kind of largesse that, so far, only Sony has proven capable of. Since Sony's Morpheus platform would presumably be tied to Sony's own hardware, the Oculus Rift is still the best hope for mass market adoption of VR technology.

My prediction is that Facebook will screw it up with gamers (given its corporate culture, it's very unlikely to do otherwise), and 5 years from now will look back on Oculus as a poor acquisition, while Sony's Morpheus project settles into a very small niche, similar to the one the Playstation Move occupies. Sony simply does not have the financial ability to take big losses in order to drive market adoption, while Facebook lacks the cultural understanding of gaming to do much other than poison the well with its ideal early adopters.

Sunday, February 03, 2013

Publishing Milestone: Piracy!

Last week, a friend of mine noticed that one of my books was pirated on Scribd. In some ways this is a milestone --- I didn't expect a book whose most valuable chapter is boring tax advice to go through three editions and get as much attention as it did, let alone be worth the trouble to pirate (especially since none of the 3 editions have DRM). It is a testament to the integrity of my early and current readers that this had not happened until now. As the license to my books indicates, you are free to lend, back up, or even resell my books without any penalty. There are many countries that are famous for being "one-book" countries --- meaning that you can only sell one copy of a book in that country before it gets pirated wholesale. Yet I've managed to sell multiple copies in some of those countries.

Different self-published authors have different approaches to the piracy problem. Gayle McDowell's best-selling Cracking the Coding Interview, for instance, has been so frequently pirated in India that she had no choice but to stop selling electronic copies of her book and only sell paperbacks on Amazon with a special cheaper edition for the Indian market. The externalities are clear: the pirate gains access to the book, but the rest of us lose the convenience of buying an electronic book.

For now, the indications are that my books haven't encountered runaway piracy (in India or other places), so I won't be taking any measures that drastic. However, if it does happen, I expect to have to take measures similar to what Gayle's been forced to take.

Tuesday, October 30, 2012

First Impressions: Windows Surface Tablet

For unrelated reasons, my wife and I were near a Microsoft store today and we took the opportunity to drop by and check out the Windows Surface Tablets. Given my experience with Windows 8 Pro recently I expected to be underwhelmed. Instead we both came away pretty impressed.

The tablets themselves don't look like anything special. The kickstands are very nice, but if that were all, I wouldn't be impressed. The keyboards, however, are amazing. The thin keyboard (the Touch Cover) is nice enough that you can actually touch-type on it with no errors. It's nothing like typing on a touch-screen. Even without haptic or audible feedback I could type at about half my normal speed, which is still at least 50% faster than on a typical touchscreen.

Switch over to the thicker Type Cover, however, and my typing speed went up dramatically, to about 95% of my speed on the Thinkpad X201. The Thinkpad's keyboard has 2 extra inches of width, so you can see why I'm impressed. This is the first tablet that I would consider a decent laptop substitute!

While I was unimpressed by Metro on the desktop, on the tablet it shines. You can treat it like any other tablet for viewing video, or whatever. But click on Microsoft Word, and the tablet UI fades and you get a Windows UI. You can bring up Internet Explorer and cut and paste to your heart's content as though you were on a laptop. You could watch YouTube on half the screen while writing in the other half. A writer could actually get decent work done on this! The machine even has a microSD slot and a full-size USB port as well as a display port, which means you can actually do laptop-y things with it, like post-process photos from a real digital camera (not the fake stuff that comes on tablets).

The big deficiency (and the one that would get me to wait for the Surface Pro) is the lack of applications. Dropbox, for instance, hasn't been ported to Windows RT yet. If you want to use Lightroom, Photoshop, or Adobe Premiere Elements, you're out of luck, at least for the foreseeable future. All these issues would go away on the Pro version of the tablet, however, and I could see myself buying one instead of a laptop in the future.

It's never sexy to heap praise on Microsoft, but if you've got an interest in tablets, you should definitely check one out in person before ruling it out. And if Lightroom or Picasa or an equivalent got ported to Windows RT, watch out: this may well become the tablet to get for any serious photographer!

Saturday, October 27, 2012

First Impressions: Windows 8 Pro

I'm not normally an early adopter, preferring to upgrade things after other people have already eaten the beta. But Microsoft made all Windows 7 users a great offer --- a $40 online upgrade --- so I couldn't resist and upgraded my personal desktop to Windows 8, confident that I could deal with whatever usability snafus Microsoft could throw at me. After all, I consider EMACS a good user interface.

The purchase UI was terrible, not because it was difficult to fill out, but because Microsoft's servers were overloaded and rejected my attempts to buy the upgrade 4 times before I finally succeeded. Where's a decent monopoly when you need one?

The upgrade process was fairly straightforward. Burn the downloaded .ISO to DVD, and then run the setup file directly from inside Windows 7 (if you want a fresh start, reboot and boot from DVD). The upgrade assistant does a good job of telling you what will or will not work on your upgrade, and which tools you'd have to reinstall. It even reminded me to deauthorize my iTunes account and reauthorize it when the machine refreshed.

People with SSDs will tell you that the upgrade takes all of 18 minutes. I have a hybrid SSD, which is of no help whatsoever when it comes to an OS upgrade, so I started it off, had lunch, and came back to find the setup screen waiting for me.

There are several nice features waiting for you in Metro. Unfortunately, I could not use most of them, because I discovered that if I linked my Windows account to my Windows Live account (which is the way Microsoft intended it to be used), my mapped network drive to Windows Home Server stopped working. Doh! Fortunately, I don't really care about sharing from the new suite of Metro apps, so I cheerfully reverted to my local Windows account, and my mapped network drive worked again.

The strongest point of Windows has always been backwards compatibility. Even so, a few things didn't work. I ended up uninstalling ATI's Catalyst software for managing the screens. This is not a problem, because Windows actually manages all that without issues. I had a few horrifying moments when I thought that SleepyHead had stopped working. I would have happily uninstalled Windows 8 Pro and reverted to Windows 7 if that had been permanent. But after swapping around some display settings and turning on the Windows 7 compatibility checkbox, SleepyHead was working, perhaps even better than before, because it stopped trying to foist those annoying updates on me.

In fact, one thing that I've noticed is that my monitors now enter sleep mode and resume from sleep mode with a stability that Windows 7 never had. It could simply be that I should have uninstalled ATI's Catalyst a long time ago, rather than using it.

The coolest thing about Windows 8 Pro? The new performance monitor. Not only does it tell me what it used to tell me, it also shows the clock speed I'm running at and whether Turbo Boost is working, and it just looks cleaner. Nice.

Windows Home Server recognized Windows 8 (despite predating it by five years), and proceeded to back up my machine just fine. That's pretty nice. I do miss the old start menu, but learning to hit the Windows key to bring up the Metro UI didn't take much. I played around with some apps on the new UI, and to be honest, I'm not sure why I'd want a tablet-style interface on my desktop. It's just too annoying to only have one thing on screen at a time. Nevertheless, if you have 2 monitors, you can designate one for the new UI and have the other for real work. So that doesn't suck at all. Windows-PageUp and Windows-PageDown also let you flip the Metro UI from one monitor to another dynamically, and it's pretty snappy.

The new Mail and Calendar apps are pretty worthless: as far as I can tell, they're no faster than Thunderbird or any other mail client, and seem more limited. The news reader apps are beautiful, but again, do I really want to use my beautiful 27" screen to host one story at a time, or would I rather have my tabbed interface on Chrome and flip between multiple stories? No contest: the traditional UI won.

It is nice, however, to have games running in full screen, and some of those Metro-style games are pretty fun even with a mouse. Also, it's nice to have Google Search with voice recognition, the way it works on my tablet. Google should have integrated that into Chrome ages ago, rather than waiting for Microsoft to force them into it. The Metro Kindle app is also a thing of beauty. But again, I didn't buy a 27" monitor to use it as a book reader.

Windows 8 was touted as having a high performance boot. I tested several reboots with a stopwatch, and discovered to my horror and disappointment that boot speed was abysmal: 1:45 for a machine that used to take 1:00. I re-read the linked blog post and discovered that I was doing it wrong: hitting the restart button causes a cold shutdown and cold boot. A standard shutdown followed by a boot, however, yields much faster results: 0:45 to a usable desktop, or about half the old time, which is quite a performance gain. Do this a few times so the hybrid SSD can learn the OS usage pattern, and I'd expect the boot performance to get even better.
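If you'd rather not juggle a stopwatch, here's a minimal Python sketch for logging time-to-desktop, assuming you drop a shortcut to it into your Startup folder so it runs at login. Note that psutil is a third-party package, and that a hybrid shutdown can carry system uptime across shutdowns, so the reading is only trustworthy after a restart or a true cold boot:

```python
import time
import psutil  # third-party: pip install psutil

def seconds_since_boot() -> float:
    """Elapsed time between kernel boot and the moment this runs."""
    return time.time() - psutil.boot_time()

if __name__ == "__main__":
    # Run from the Startup folder, "now" approximates the moment the
    # desktop becomes usable, so append one line per login.
    with open("boot_times.log", "a") as log:
        log.write(f"{time.ctime()}: {seconds_since_boot():.0f}s to desktop\n")
```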

Do I recommend Windows 8? It depends. To me, the new OS feels just a bit snappier on certain tasks --- flipping between applications seems faster. What I like most is the increased stability of sleep mode: sleep never used to be this fast and reliable. I like the faster boot speed as well. I do miss the old start button UI, but I haven't found adapting to the new UI to be onerous or horribly confusing. It's a bit jarring, though: you do feel like you're flipping modes.

All in all, for $40, I feel like it's a decent deal. I wouldn't pay full price for the upgrade, especially for a desktop without a touch screen. I wouldn't go out of my way to do an upgrade (I did, but I was also curious about the new UI), but it's nothing to run away from either. I certainly don't see it as the disaster some pundits have been calling it. Mildly recommended.

Saturday, October 13, 2012

Overview: Video Editing Software

One of the strange things that happen to you as a parent is that you suddenly end up shooting lots of videos. Lots and lots of video. Gigabytes worth. Most of the time, this is not a big deal. For instance, if you shoot a short segment of your baby playing around, you can just post it on Facebook or YouTube, and you'd be done. Most of the time, you don't even bother editing the video, attaching a sound-track, or stitching multiple videos together.

That changes when you hit certain milestones, like Bowen's upcoming birthday. While you could just slap together everything you shot over the past year, chances are you don't really want to string along all the videos and call it done. You want to pick out certain highlights, maybe add some interstitials or captions, maybe even attach a soundtrack. If some of your shots were done in less than ideal conditions, you might want to do contrast adjustment or color correction. If your footage was shot in different formats, you need to reconcile all those formats and output to DVD, Blu-Ray, or MPEG. Since I refuse to buy Apple products as long as I have a choice about it, this overview covers only Windows PC products.

For basic videos like this one of Mike Samuel riding in Downieville, the simplest option is the free Windows Live Movie Maker. The user interface is very intuitive, attaching a sound-track is easy, and you can easily caption, edit, and output to YouTube or to WMV format (which I usually then transcode to MP4 using Handbrake). If you do most of your shooting outdoors with no color-correction needed, then this is all you will ever need, and you'll be happy. I've yet to run into a clip that Windows Live Movie Maker can't handle, and I have not run into any length or capacity limitations. The software also makes full use of my quad core machine, and is parsimonious in its memory use.
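For reference, the transcode step can be scripted. Here's a minimal sketch assuming HandBrakeCLI (the command-line version of Handbrake) is installed and on your PATH, with a quality setting I picked arbitrarily:

```python
import subprocess
from pathlib import Path

def transcode_to_mp4(source: Path) -> Path:
    """Re-encode a Movie Maker WMV export as an H.264 MP4."""
    target = source.with_suffix(".mp4")
    subprocess.run(
        ["HandBrakeCLI", "-i", str(source), "-o", str(target),
         "-e", "x264", "-q", "20"],  # -q is constant quality; lower = better
        check=True,  # raise if HandBrake exits with an error
    )
    return target

transcode_to_mp4(Path("downieville_ride.wmv"))  # hypothetical filename
```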

When do you need more? The big one for me was color correction. If you shoot in fluorescent or tungsten lighting, then just as with stills, the footage will look green or orange. Another possibility is that you shoot in formats unsupported by Windows Live Movie Maker (unlikely), or that you want multi-track audio (e.g., one voice track, one music track, and one narration track).

My brother had a copy of Adobe Premiere Pro lying around as shelfware, so I tried that first. Professional level software is great if you have professional needs. But if you're a parent pressed for time, the interface is just too much. I ended up running away from it in horror.

A friend of mine's an Adobe employee, so I got a copy of Adobe Premiere Elements 10 instead for about $25. This is the version of Premiere with "training wheels." By default, you get "Scene Mode", which basically lets you drag and drop clips into a timeline, rearrange them, add an audio track, add in title screens and then go. If you decide that's too basic, you can flip it into Timeline mode, and now you have an advanced UI to go with more advanced needs.

The tool is obtuse. For instance, to do color correction, you first select the "Effects" button, then select "Auto Color", and click Apply. There's no preview, so you have no idea what you did until you hit the "Render" button to see the impact of your selection. "Render", of course, is the equivalent of "compile": it's time consuming, chewing up nearly all your CPU for minutes if not hours at a time, and then giving you a chance to discover that you screwed up your settings and have to try all over again. There's a three-way color corrector tool as well. Unfortunately, if you're color-blind like me, you have to use that tool with your wife standing behind you, checking to make sure you didn't screw up too badly. This is not a tool for the faint of heart, but it gets the job done.

The worst part about Adobe Premiere Elements is that it is SLOW. By this, I don't just mean the frustrating "render" times. The interface is laggy, at times taking forever to respond to mouse clicks or to dragging the slider bars around as you edit your video. I have no idea what the software is doing under the covers. The only explanation I can come up with is that the geniuses at Adobe decided to write the UI in Ruby and implemented it in the most naive way possible. The software doesn't crash often, but it crashes often enough that I'm grateful for how frequently Premiere Elements auto-saves for you.

The most challenging part of the video editing process is selecting the clips and getting them into Premiere Elements in the first place. You might think that since Lightroom and Premiere Elements are both Adobe tools, there'd be a simple drag-and-drop interface between them, so that footage flagged in Lightroom could easily be selected for use in a video. Well, you'd be wrong. There's no integration at all between the two pieces of software, which means that I'm forever clicking "Show in Explorer" in Lightroom and then manually dragging the file into Premiere Elements. This is the kind of thing that makes me wish Microsoft would get into the video editing/photo editing business, just so Adobe has some competition in this area.
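One workaround is a dumb little script rather than real integration. Here's a minimal sketch, assuming you first export your flagged clips from Lightroom into a single folder; both folder names are hypothetical:

```python
import shutil
from pathlib import Path

LIGHTROOM_EXPORTS = Path(r"C:\Users\me\Pictures\lightroom_flagged")
PREMIERE_STAGING = Path(r"C:\Users\me\Videos\premiere_staging")

PREMIERE_STAGING.mkdir(parents=True, exist_ok=True)
for clip in LIGHTROOM_EXPORTS.glob("*.mp4"):
    # copy2 preserves timestamps, so clips still sort chronologically
    # when dragged into the Premiere Elements timeline
    shutil.copy2(clip, PREMIERE_STAGING / clip.name)
```

It's a band-aid, but it beats clicking "Show in Explorer" once per clip.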

If editing videos taxes your patience, rendering them will push your hardware to its limits. I have an i7-920 processor with 10GB of RAM installed. Pushing the "Export" button makes my PC go away for 2-3 hours at a time to render a 1 hour video. With the CPU monitor running, I could easily see that all 4 cores were in full use --- the CPU fan runs at full speed and nearly everything else on the PC slows to a crawl. I'm the kind of person who's never tempted to buy faster hardware as long as my existing computer runs, but the long render times got me browsing around to see whether faster hardware would reduce my pain for this once-a-year event. (Turns out the answer is "no": I bought my PC in 2009, and in 3 years, PC CPUs have only gotten about 2X faster --- not nearly enough to justify the upgrade. When 8 core CPUs become cheap enough for consumer use, I might revisit.)

Despite all this, I'm sticking with Premiere Elements 10 for my big video projects. That's because the learning curve is so steep that once you've gotten comfortable with the software, it's not even worth considering, say, upgrading to Elements 11 without a compelling feature (such as software image stabilization) that would make it worth paying that learning curve price all over again. Such is the state of video editing today. In any case, Intel has said for years that in-home video editing would be a compelling reason for consumers to upgrade CPUs, and I disagree. The state of the software is such that I don't see the typical user doing this, ever. It's just too hard.

Saturday, July 28, 2012

Review: Orange Internet Max (France)

As a very cheap person and proud of it, I rarely run my Nexus One in data mode when I'm at home: I'm usually within wifi range, and I refuse to pay the exorbitant $3/day or $35/month prices that US providers charge. When traveling, however, I value data plans highly and would willingly pay those prices if asked.

Last year, I had trouble getting even regular voice SIM cards, let alone Internet-capable ones. This year, however, we started our trip in Paris, albeit on a weekend. On Monday, I went to an Orange store and got a prepaid SIM card. It cost EUR 9.95. I bought a 10 EUR refill right away so I could subscribe to the Internet Max plan (which was 9 EUR, but the SIM card only came with 5 EUR of credit, and the minimum refill was 10 EUR). It's an unlimited data subscription plan that's good for a month and automatically turns off if you don't have enough credit to resubscribe! The worst part of the experience is the part where Orange tries to pretend to be Apple. You walk into the store and are greeted by a pretty woman dressed in an Orange uniform, who puts your name in a queue (driven by an iPad) so you can browse the store until a customer service rep is ready to talk to you. Unfortunately, they got this Apple-emulation strategy wrong: they had too many pretty women and not enough customer service reps, so I ended up cooling my heels for at least 25 minutes before being able to complete an incredibly simple transaction. I would have preferred standing in line like at a normal store.

What an awesome plan it is. Most of the time, the speed is fine: much faster than the iPhone 4G that I got as part of the home exchange program we participated in. And of course, any Android phone runs circles around the iPhone as a matter of practicality. Being able to get turn by turn navigation saved our bacon several times while driving (or walking!) around France. We were also able to tether the phone to the laptop whenever we were at a hotel without internet. Try getting this with your post-paid plan in the USA for less than $25/month!

The best part is that while Orange will try to charge you separately for e-mail, if you're using an Android phone there's no need to pay for the e-mail plan at all. That's because the Gmail app on Android uses HTTP requests, so it looks like browser traffic to Orange, rather than the IMAP/POP traffic that Apple products use.

As an aside, after using an iPhone side by side with a 2 year old Nexus One running Android 2.3 (no I haven't bothered to upgrade the default OS yet, and probably won't --- I'm cheap with my time as well as money), it's no contest. I'd rather have a 2 year old Android phone than an iPhone when I'm in a foreign country and in need of navigation, search, and making phone calls.

Recommended. An Orange store should be the first thing you look for when you land in France.

Thursday, May 03, 2012

Review: Google Drive

I was one of the enthusiastic early users of Google Drive way back in 2007, when it launched internally at Google. It was great. I would drop stuff into it, and I could pick things up from my laptop, desktop, or if my laptop's hard drive crashed, I'd get all the data back. Thanks to the magic of VPN, I could even get those files sync'd to my home machine. I was very sad when Google Drive got canceled.

When I started Independent Cycle Touring, I discovered that Dropbox worked better than Google Docs. I managed to wrangle some more free quota, and started putting all my important files on Dropbox. (The source files for Independent Cycle Touring alone were more than 4GB, so getting extra quota was important) At last year's Worldcon, big name authors were telling newbies to get and install Dropbox and put all important work in there so it would be backed up.

With Dropbox now worth $4B, Google Drive was hurriedly revived and launched recently. My wife and I were curious, so we played around with it a bit. First of all, the UI is lousy compared to Dropbox's. When you create a new folder and move files to it, there's no way to say "share this with someone" directly from Windows Explorer. You have to know to visit Google Docs on the web, select the folder, and share it there. The recipient then has to move the shared folder into her "My Drive" folder before the files are sync'd to her hard drive!! This is a major botch up! On Dropbox, if you share a folder with someone, they receive an e-mail, and once they click "accept", the folder is automatically sync'd to their local drive, no questions asked. It took us a while to figure this out.

Conflict resolution is crude. We both edited a file at the same time in Microsoft Word. On Dropbox, the simultaneous edits and saves would create multiple copies of the same file with our different edits. This could be annoying to resolve, but at least you knew what happened. On Google Drive, the first copy saved just fine, and then the other person got a "Cannot sync this file" message with no explanation.
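At least Dropbox's artifacts are easy to find mechanically, since it inserts "conflicted copy" into the duplicate's filename. Here's a minimal Python sketch that lists them for manual reconciliation; the Dropbox folder location is an assumption:

```python
from pathlib import Path

DROPBOX = Path.home() / "Dropbox"  # default location; adjust if you moved it

# Dropbox names duplicates like "report (Bob's conflicted copy 2012-05-03).doc"
conflicts = sorted(DROPBOX.rglob("*conflicted copy*"))
for path in conflicts:
    print(path.relative_to(DROPBOX))
print(f"{len(conflicts)} conflicted copies to reconcile")
```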

We didn't try syncing large numbers of files, which we know works well on Dropbox, and works badly on SkyDrive (one file at a time, no smart scheduling of small files to sync first).

Conclusion: Dropbox is still the one to beat. If I were a Dropbox employee or investor, I would not be worried by Google's entry into this field. If you're already a Dropbox user, there's no need to switch. If you're a Google Drive user, you should consider switching to Dropbox. (I am not an investor in Dropbox, just a happy user.)