
Thursday, September 05, 2019

Connection Options for the LG 43UD79-B

Over the last 6 months, I still have not found a monitor as good as the LG 43UD79-B.  It's huge, needs no font scaling at 4K, and performs well for writing, photography, and video editing. I feel like I'm going blind every time I have to use anything smaller.

The monitor, however, comes with 4 HDMI ports but only 1 DisplayPort and a single USB C port. 2 of the HDMI ports are special, supporting 4K60fps. While the DisplayPort connection is reliable and has always worked, as soon as I acquired the XPS 13 I wanted to be able to connect it to the monitor. This demonstrated that several of the cabling standards are anything but standard or reliable, so I'm documenting everything I tried.

The first thing I tried was a Keychron 10-in-1 USB hub adapter.  This one was very appealing because it was cheap (about $40 if you find a coupon), offered 4K output, could charge the laptop, and provided 3 USB ports, an ethernet jack, and even a VGA port. (Nowadays monitors don't even come with a VGA port, but you might have a legacy monitor sitting around.) The problem? No 4K60fps. 4K30fps max. Back it went to Amazon.

Then I tried the Cable Matters USB C adapter with HDMI, DisplayPort, VGA, and Ethernet.  No HDMI 4K60fps either, even though the very same adapter's DisplayPort would give you 4K60fps. But the monitor's DisplayPort was already spoken for by the desktop machine, and seriously, if I was going to do that, why not get the Amazon Basics adapter, which is way cheaper?

Finally, the uni USB C to HDMI 4K60fps cable worked. I went for the short 3 foot cable, and this one was reliable and gave me 4K60fps. At $19, it's not the cheapest cable in the world, but as I've already discovered, not all HDMI cables are created equal, so if you need one of these, this is the one to get.

I'd also tried several USB-C cables that claimed to be alt-mode compatible. This has great appeal, since a single cable can potentially provide USB data, display output, and laptop charging, simplifying your setup to just one cable. To my surprise, after years of trying one cable after another and failing, the cheap ($12) Nekteck USB C 3.1 Gen 2 cable worked, delivering 4K60fps.  To my disappointment, the XPS 13 wouldn't charge from the monitor, as the monitor only delivered 7 watts of power!

So there you go: I've got 2 cables that work at 4K60fps, and 2 that don't! Compared to last year, the situation is definitely improving!

Friday, June 16, 2017

Building a custom desktop

In recent years, performance on consumer desktop processors has pretty much stagnated, so I felt comfortable sticking with my 2009 HP m9600t. That machine's had several upgrades: SSDs, additional RAM, and a new GPU. It's had hard drives added and expanded, and a Blu-ray drive when those became cheap. Over the past year, the ethernet port went out, so I added a PCI ethernet card to it. The machine's also gotten flakey over the years: it no longer slept or hibernated properly, waking up whenever it put itself to sleep. While this was annoying, I lived with it by hard powering off the machine every time I turned it off. For a while, it wasn't a big deal, as I used a laptop for non-hard-core tasks (anything not involving Adobe software).

Then I won a Gigabyte Z170X Gaming 7 motherboard in a web-site lottery. The motherboard was not the latest, but it supports over-clocking, and was forward compatible with the latest Intel desktop processor: the Core i7 7700K. Since I had a friend at Intel who could supply me with the processor (and a PCIe M.2 SSD) at employee discount prices, I figured I could build a new machine. Note that in most cases you can purchase an equivalent machine from Dell or some other white-label manufacturer at a better price than putting it all together yourself. Those manufacturers, however, typically skimp on parts. For instance, the case of my HP m9600t was so tightly packed that I pinched my fingers every time I installed or replaced a part. Similarly, the PSU is usually not an energy-efficient unit. Back in the old days, you wouldn't keep a PC around long enough for a more efficient PSU to pay for itself, but now that you typically keep a PC around for 10 years or more (because of the failure of Moore's law), there's no reason not to pay for a better unit, especially if it's quieter.

As a first-time PC builder, I went with the Fractal Design Define R5 case. I picked it because it was a fairly large case, which meant no more pinched fingers. It comes with 7 3.5"/2.5" drive trays and 2 5.25" bays, which I figured would be sufficient capacity even for a storage-hungry photographer/video editor. The case was indeed huge, but to my surprise it was well balanced and easy to handle. It also came with an ample set of screws and nice features such as being able to change the direction in which the front door opens.

The instructions start with screwing in the power supply, which is apparently a standardized size in PCs. My selected power supply screwed in just fine, and then I plugged in the power cord and grounded myself using an anti-static strap. Next came the motherboard. Plugging in the processor was easy, but installing the cooler felt like something you had to be much more careful about. I'd acquired both a water cooler and an air cooler, but at the last minute went with the air cooler for simplicity, so I wouldn't be managing 2 pieces attached to the motherboard. The air cooler was interesting because it had multiple orientations, and you're supposed to point it up or out of the case for better airflow, so I played some 3D rotation games before I settled on "up." I then plugged in the memory and the SSD. The SSD is weird because the motherboard had a bizarre table showing which configuration of SSD installation would preclude the use of which other SATA ports and/or reduce the speed of the PCIe SSD. I found myself thinking: "Really, Intel? Really?!" Apparently this has been fixed in the latest Z270X motherboards, but of course, I wasn't going to buy one when I had one for free. But the next step after selecting the right slot really puzzled me.

All motherboards come with a backplate. You're supposed to insert it into the case, screw standoffs into the case, and then set the motherboard on top and screw it down. What surprised me, after having such an easy time with the processor, cooler, memory, and SSD, was how much I had to wrestle the motherboard and backplate together into the case to make everything line up. You have to tighten down the screws, because otherwise inserting or removing display cables or USB cables would shift the motherboard, which would not be good. I did so without damage (I thought!).

Then I started plugging cables into the motherboard. The manuals here just don't help much. For instance, some of the case fans have only 3-pin connectors while the corresponding motherboard headers have 4 pins! I had to do some googling around before figuring out which 3 pins should be used. Similarly, for many of the single jumper cables I practically needed a magnifying glass and tweezers to get a 5mm connector onto a pin squeezed into an 8mm space. This was definitely a pain. This was also where spending lots of money on the case helped. The Fractal Design case had rubber grommet windows where many of the cables were already pre-wired to run correctly. Unlike my HP, where there were cables everywhere, you could place only the cables you needed and route even those behind the motherboard tray, so you had nothing hanging on top. Working on this was a pleasure.

Then came the moment of truth: plugging a display cable in and seeing if the machine would POST. To my horror, when I powered it on, the fans spun up and then spun down. Something was horribly wrong. I googled around and finally figured out that I'd made a rookie mistake: I had forgotten to plug in the CPU power cable. For whatever reason I thought that giant 24-pin cable plugged into the motherboard ought to be sufficient. It's not. I plugged the 2 4-pin CPU power connectors into the motherboard socket, and the machine POSTed!

After that, the rest of the process was easy, though I was disappointed that the "backside of the motherboard" 2.5" SSD trays didn't actually fit 2.5" SSHD drives. But I moved over the Blu-ray drive, installed 3 HDDs, and still have room and power left over for more.

After installing Windows 10 (which transferred the license over with no issues), the machine sleeps and hibernates reliably and is also incredibly quiet. I tried over-clocking it a little without problems, but probably won't do too much. Lightroom and Premiere Elements 12 now fly! As usual, the storage upgrade to a PCIe SSD was probably more responsible for that than the mere 3X increase in CPU performance.

I haven't installed a GPU yet, so I'm relying on the built-in Intel GPU, which many enthusiasts love to complain about. I am still of 2 minds as to whether to decommission the old machine or let my son use it, but if the latter, I can take my time shopping for a GPU.

I must say that overall, the process has been much easier than I expected, and some of it was (dare I say it) even fun! Just like a bicycle you've built yourself, there's something special about a machine you've put together with your own hands. I expect that this is probably the best approach if you're not in a hurry for a machine and have time to shop. My wife's Dell now sounds loud by comparison, while my old HP sounds like a jet engine whenever it does anything compute-intensive. Given the changes in the PC market over the past years, I fully expect this to be the correct approach going forward.

Monday, January 23, 2017

Review: Gears of War 4 (PC)

Over the last few years, Microsoft's been increasingly making rational business decisions. One of the seemingly less rational ones, however, is the introduction of XBox Play Anywhere. The idea is that if you bought an XBox game, you'd be able to play it on both an XBox and a Windows 10 PC (well, a PC with a decent graphics card, at least). In theory, this is a nice perk for folks who've bought completely into the Microsoft eco-system. Except that I don't know why you'd have both a PC with a high-end graphics card and an XBox.

There are several problems with this: first of all, if you have a capable PC, this definitely means you won't buy an XBox. Maybe Microsoft doesn't care, or maybe it's just a side effect. The second issue is that the PC ecosystem is still wonky. Some games (e.g., Quantum Break) may or may not run well on your system.

In any case, I wouldn't normally pay the absurd prices digital vendors ask for consumables such as video games. But over the holidays, Microsoft ran a series of Microsoft Rewards specials that enabled me to convert Microsoft Rewards points into credit for the Microsoft app store. Since I had 20,000+ points, I took full advantage and ended up with a large app store balance. The Microsoft app store, sadly, is lacking in useful programs, so I picked up Gears of War 4.

Gears of War 4 belongs to my favorite genre of shooter games: the 3rd person cover-shooter. Gears of War has the reputation of being the series that introduced this genre to the world, so I pulled the multi-gigabyte download onto my D drive and started up the app. PC gaming has a reputation for being very complex, requiring lots of tweaking and tuning in order to maximize image quality while still retaining a high enough frame rate to be acceptable. To my surprise, out of the box the app detected my system settings and picked a compromise that I could not casually improve in about 10 minutes of playing with the dials and sliders available in the settings screen. That made me feel like Microsoft had really done its homework.

Then, when I started the game, it crashed as soon as the opening titles began. Not only did it crash, it crashed without a dialog box, without a log file for me to look at, or even any indication that there was anything wrong. The system snapped back to the desktop as though I'd quit using a keyboard shortcut. Not cool. I searched around for a solution but couldn't find an answer. I eventually stumbled upon this: a Universal Windows Platform (UWP) game cannot be installed on any drive other than C and still run. What's this? Did we regress to the mid-1990s, when everything had to be installed onto the C drive? Wow.

Other than that, the program has been mostly stable. In the last act of the single-player campaign I ran into hard system crashes, but then again, my 8-year-old PC is starting to get flakey in general, so maybe that's to be expected. In any case, once I figured out the C drive issue I could play, but I can certainly understand how the PC gaming ecosystem got its reputation for being unfriendly or even user-hostile, on top of being expensive and bulky.

The game itself is fun. Here you have to split your understanding between "fun as a game" and "fun as a movie experience." Games nowadays have movies driving a plot in between playable parts. Games like Uncharted 4, The Last of Us, or Batman: Arkham Asylum have excellent plots, fantastic pacing, and a nice balance between game play and movie watching, so you're never bored and have a good experience. A game like Rise of the Tomb Raider might have better game play (including more complex but satisfying resource management systems), but much worse writing and plotting. Gears of War pretty much says, "Forget the story --- it's just an excuse to dump you into the game play loop."

Gears of War's game play loop, however, is pretty bland! There are no real resource management issues: fundamentally, all you have to do is pick up ammo or switch weapons. Sometimes the weapons left for you on the battlefield are a hint as to what's coming up. Several times, you have to play a "hold the line" scenario, in which you can deploy fortifications to help you hold the line and even carry over resources from one wave of enemies to the next. These are particularly fun and can withstand repeated play. But that's it. Now, it's been a while since I had a cover shooter to play, so I had a lot of fun, but there's no way you would pick up an XBox just for this game, nor would you even bother paying money for it, since other games do a much better job of the same thing. Nevertheless, as a freebie, it's a game that doesn't waste a lot of time, jumps straight into what it does best, and gives you loads to do.

There are a few mechanical niceties. First of all, the game gives you at least one companion character at all times. Those companion characters can take care of themselves and each other, as well as save you if you get hurt badly (you can also crawl back to one of them to get "rescued"). Then I noticed the enemies doing the same, so the mechanics apply to them as well. Very sweet. The game is much less lonely than a Batman game or a Tomb Raider game as a result, which is a very good thing.

The story and characters aren't much to speak of, though some of the banter is great at making fun of the game itself. The boss fights are fair, and the game never overstays its gimmicks. In short, this is competent, polished work. Just not inspired. If you have an XBox One (or a gaming-capable PC) anyway, a sale might make this worth picking up. And it is one of the few games where configuring the graphics settings isn't an exercise in frustration. As such, I can recommend it if you enjoy 3rd person cover shooters.

Friday, December 18, 2015

Review: Her Story (PC)

I try not to play games on the PC: it's difficult because I get distracted by e-mail and other work. The high resolution and close seating position of the PC also mean that even though I have a relatively powerful PC, it can't quite drive the display at high enough resolutions for a satisfying experience without spending more money on graphics cards, which I'm not prepared to do.

Her Story, however, is an exception. First of all, it's a game completely driven by the keyboard, which means the PS4 (or any other console) would be a poor experience. Secondly, it's deliberately shot in low-definition video, so it doesn't depend on high resolution for its effect. Finally, it's short: I finished it in a couple of hours, and it's fun enough for me to recommend.

The game sits you down in front of a 1995-era police database/computer. By typing search terms into the computer, you get access to video clips of various police interviews with a woman who starts off by coming in to report her missing husband. You then use search terms to follow threads of investigation and figure out what happened.

The story and investigative process are actually interesting: sometimes your query terms match more clips than can be displayed on screen, and the game restricts you to only the first 5. So you end up refining the search terms in order to uncover more of the story. Your thoughts about the investigation keep changing as you come up with and then discard various hypotheses and follow through on them. My one criticism is that I didn't think the actress was actually very good: some of her mannerisms were a little forced.

I was wondering how I was going to figure out whether I was done with the game (i.e., running out of things to discover), when the game figured that out and hit me with an IM which told me who was sitting in front of the computer. After the credits roll, you're given a key with which to unlock more search results, which led me to go back through my search history to find pieces I had missed, and also gave me a nice history of my investigation and thought process.

Her Story is priced at $5.99 on Steam. I picked it up for $3.84 during a sale, and at that price, it's at least as good as any movie rental I've made over the past few years. Recommended.

Friday, August 29, 2014

Review: Anker X201 Replacement Battery

Ever since I got Xiaoqin her Surface Pro, she's relinquished my (by now ancient) X201 back to me. Laptop batteries usually get killed by heat and by being kept fully charged, and the X201's was no exception. A few months of running Android Studio on the laptop while plugged in killed the original OEM battery: regular unplugged use might have cost it 40% of its capacity, but instead it was down to 10 minutes of run time. The thing about the X201 is that the keyboard's still the best you can get for a laptop of this size and weight, and writing doesn't consume lots of CPU power, so I use this as my primary writing machine, relegating the desktop to heavy-duty work like Lightroom, Premiere Elements, and InDesign. Even with 2 Moore's law cycles since then, newer laptops still have not caught up to my 2009 desktop's performance.

For a fairly new laptop, it would have been worthwhile to hunt down an OEM battery and pay full price for it. For a laptop that's been this well-used (albeit upgraded), I settled for the Anker X201 replacement. Anker warranties the battery for 18 months, but the biggest risk with non-OEM batteries is that they can melt down and set your laptop on fire. I've had a 3rd party Macbook battery warp so badly that it wouldn't fit in the battery slot any more after I wrestled it out. Of course, nowadays Macs don't come with user-replaceable batteries, so the optimal solution would be to throw away the Mac and buy yourself a real computer with replaceable parts.

The battery slotted snugly into the battery well, and surprisingly, the Power Manager on the Thinkpad recognized it! I didn't expect that and was impressed. The manager says the battery's good for 47.34Wh while the specifications claimed 49Wh, indicating some minor deterioration while the battery was sitting in Amazon's warehouse. On the initial charge, the battery indicator said the battery was only good for 2 hours, but after 4 charge cycles it now says 3 hours. This is more than good enough for my general use of the laptop, and comparable to the OEM battery.

Newer machines such as the Surface Pro and Macbooks no longer have user-replaceable batteries, making it worthwhile to hang onto older machines such as the Lenovo Thinkpad for as long as you can. The Anker goes a long way towards helping with that. Recommended.

Saturday, August 23, 2014

Convergence

Microsoft's Surface Pro is a bet that convergence will lead to a device that blends a laptop and a tablet. I'm a fan of both the Surface Pro and the Dell Venue 8 Pro, both of which do things that neither a tablet nor a phone can do. However, I believe that Microsoft's approach is at best flawed.

The convergence I'm betting on is between tablet and phone. I noticed this when my wife, who owns a Galaxy Note 2, the above-mentioned Venue 8 Pro, and the Surface Pro, would use either the Galaxy Note 2 (for general surfing, quick purchases, or Facebook) or the Surface Pro (for general content creation). But her mode of use of the Surface Pro is that of a desktop, usually tethered to a large monitor, rather than a tablet. While many have complained about the battery life of the Surface Pro, she's never even noticed, indicating that her time spent running unplugged is minimal.

This makes sense: the Galaxy Note 2 is already fast enough compared to the Venue 8 Pro, and its screen size at 5 inches is comparable to the Venue's 8 inch form factor. For watching movies, etc., they're both already pretty good (the Note 2 at 1280x720 and the Venue 8 Pro at 1280x800 are close), and the Note 2's handy stylus is much easier to access than the Venue 8's. The reason 8" tablets took off is the price: while the Galaxy Note 2 was close to $600, you can get a Venue 8 Pro at around $200, or a third of the price. But you sacrifice considerable functionality to get there: you no longer have always-on internet, the resolution of the screen doesn't go up commensurately, and things like Bluetooth are much clunkier.

More importantly, you only have so many hours in the day, and you've got a lot more experience with the phone than the tablet. So even though the tablet might be better for some things (e.g., the Venue 8 Pro's browser is superior, being a real web browser, unlike the Galaxy Note 2's), you might not spend the time picking it up unless there was a specific use case, such as a web site that just refuses to render properly on the Galaxy Note 2.

So what I think Microsoft needs to work on is a 5.5" (or 6"!) phone running a full-on version of Windows with appropriate software. Such a device might even have a port for an external monitor. At $600-$800, it would clearly be superior to existing tablets and phones, and I might even consider getting one. This device could easily eliminate the need to carry a phone, tablet, and laptop. Of course, getting sufficient battery life and power out of such a device might be a technical challenge, but it's one that's very much suited to Microsoft's engineering team.

Monday, April 28, 2014

Review: Android Studio

Since helping my wife with her Nutrition Tracker App, I'd had a chance to try both Eclipse and Android Studio for Android app development. Both of them run on Windows, my preferred platform, but it took less than 3 days with Eclipse for me to get frustrated with frequent crashing, features not working, and a lousy layout tool. I found Android Studio, downloaded it, and soon persuaded my wife to switch to it.

Android Studio is based on IntelliJ IDEA. Back at Google when I was doing Java work, I avoided IntelliJ like the plague, preferring to stick with Emacs and gtags. That's because Google's Java source base was so big you couldn't possibly load it into IntelliJ on the puny workstations of that time (yes, Google only supplied machines with 2GB of RAM in 2003), and even if it had been possible, those machines would have slowed to a crawl under the load of processing that much code. IntelliJ/Eclipse die-hards resorted to wacko tricks like subsetting Google's code base so it could load into IntelliJ and then writing plugins that hooked into gtags for accessing the rest of the source code. I have no idea what Googlers do today, but my suspicion is that things haven't gotten much better.

For small Android projects like Nutrition Tracker, however, an IDE like IntelliJ is just about right. If you're unfamiliar with the Android API, it supplies method name completion, tells you which arguments to supply in which order, automagically adds imports, and allows for automatic refactoring tricks such as moving methods, renaming variables safely, moving inner classes out of their outer classes, shifting classes between packages, etc. The layout tool helps you avoid having to learn the lame layout XML language, so you can actually try to make things work (as opposed to making things look pretty and usable --- I think Emacs is a great UI, so I have no expertise on those topics).

Android Studio is slow, though. It's slow to start up, it's slow to compile, and it's slow to run the debugger. A typical edit-compile-debug cycle takes around 10-20 seconds just to build a tiny app. Note that I'm not complaining about Android Studio's massive use of resources while I'm editing. I think that's entirely appropriate. I want all my 4 cores/8 threads to be utilized in order to make my coding experience faster and more pleasant. I don't even mind the startup time, since it doesn't happen that frequently and it's a one-time cost. But the Gradle build system is not only a resource hog, it also introduces additional latency into my think-time, so I begrudge every second it spends traversing dependency graphs instead of actually compiling code. I have no idea why the Android Studio engineers chose a clunky system like Gradle, as opposed to rolling their own and integrating it fully into the IDE. I never want to edit the Gradle build files manually, but the system forces me to. What's more, the syntax is really obscure and the documentation is inadequate.

For instance, when doing an Android release build, the documentation only covers Eclipse. Worse, the documentation lies to you. It tells you to modify your manifest file, and I did, then kept scratching my head as to why that never worked. It turned out that you have to modify the Gradle config, since the Android manifest XML settings are ignored when you're building with Android Studio. Well, that took so much googling around that I can't remember what search term I used to uncover the Stack Overflow answer any more.
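
For what it's worth, the file you end up editing looks roughly like the sketch below. This is a minimal, hypothetical app-module build.gradle: the exact plugin id and property names vary by Android Gradle plugin version, and the SDK numbers, version fields, and keystore details are made-up placeholders rather than anything from the actual Nutrition Tracker project:

```gradle
// Sketch of an app module build.gradle (adjust for your Android Gradle plugin version).
apply plugin: 'com.android.application'

android {
    compileSdkVersion 19
    buildToolsVersion "19.1.0"

    defaultConfig {
        // These values take precedence over whatever AndroidManifest.xml declares.
        minSdkVersion 15
        targetSdkVersion 19
        versionCode 2
        versionName "1.1"
    }

    signingConfigs {
        release {
            // Placeholder keystore and passwords; keep real credentials out of source control.
            storeFile file("release.keystore")
            storePassword "change-me"
            keyAlias "release"
            keyPassword "change-me"
        }
    }

    buildTypes {
        release {
            // Sign the release build so the assembleRelease task produces an installable APK.
            signingConfig signingConfigs.release
        }
    }
}
```

Once you know that this file, not the manifest, is what the release build actually reads, the process is straightforward; discovering that fact was the hard part.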

The source control integration is also funky. It supports Git, Mercurial, and Subversion, but not Perforce. Given that Google uses Perforce internally, I surmise that Google's internal projects do not use Android Studio. This does not bode well, since it means that Android Studio's most serious problem (build performance) will most likely never get addressed, because its non-existent internal customers will not feel the pain.

For quick and dirty Android projects, Android Studio is probably the best there is. If you're serious about building an Android app, however, my suggestion is that you use Emacs and roll your own build system that's decently fast. Otherwise, the benefits of using an IDE will be swamped by inordinately long compile/edit/debug cycle times. Note that though my machine is old, it's still decently powerful compared to even the fastest rigs today, let alone the kind of laptops most "hip" developers favor, so it's unlikely you can solve Android Studio's problems by throwing more hardware at it.

Recommended only for non-serious Android projects. It's a great tool for getting started quickly, though, so use it to bootstrap yourself into doing Android development if that's one of your goals.

Wednesday, April 16, 2014

First Impressions: Microsoft Surface Pro

Our trusty X201 had been getting long in the tooth, and Xiaoqin decided to try some Android development. If you've ever tried Android Studio, you'll know that it's a CPU-intensive process, since it's based on IntelliJ IDEA. The build system associated with Android Studio, Gradle, is also a massive CPU hog and introduces no small amount of latency to the process. I never thought I'd miss the days of Visual Studio, but it's quite clear that compared to Microsoft's development tool set, Android's is quite a bit behind, and extremely unstable. Of course, in terms of market share, the positions are exactly reversed.

After trying out a Surface Pro in the store a year or so back, I vowed never to buy a conventional laptop again if I could buy a Surface Pro-type device. Fortunately, Microsoft was having a sale on refurbished Surface Pros, so I was able to pick up a 128GB model for $450. You can find them for about $500 if you're willing to put up with a 64GB model. With USB 3 and a microSD card slot, it's probably no big deal if you can't find the 128GB model.

As a laptop, it's quite impressive. It's about 50% faster than the older X201, and 3X faster on boot-up, hibernation, and recovery from hibernation, with boot times going from 30s to 10s. And yes, this is with the X201 upgraded to an SSD. There are a few strange fit and finish issues, such as the Mini DisplayPort socket not being very deep, so when you insert a standard cable a little bit of chrome sticks out. The tablet comes with a pen, but there's no place to put it except the magnetic charging port, and the magnetic charging port isn't strong enough to retain the stylus if there's any pressure whatsoever on it. Since this is an expensive Wacom digitizer stylus, you really do want to keep track of it!

Running Lightroom is as fast as you might expect, with no hitches, and the Surface Pro had no problem driving the 27" HP monitor at 2560x1440. One nice mode you can run is to have the touch screen show the Start screen while the big display runs the desktop. This gives you a nice touch UI for the touch part, while keeping the desktop for real work. Of course, Microsoft had to glitch this up---in this mode, desktop apps still launch onto the small screen instead of automatically selecting the big screen. It's this kind of inattention to detail that gives Apple its edge over Microsoft, though I've found Macs to have their share of problems when using multiple screens.

The device has a fixed 4GB of RAM, but surprisingly, until I told Xiaoqin about it, she didn't even notice it didn't have as much RAM as her old device. At least part of the reason is that Windows 8 Pro actually consumes fewer hardware resources than Windows 7 did. The other part is that in recent years, software developers just haven't been able to assume more than 4GB of RAM anyway, so as long as you're single-tasking or running just one web browser and an application, you're generally OK.

As a tablet, the Surface Pro is quite hefty, though not as hefty as the X201. It makes up for that, however, with power. I'd already written about how much faster the Dell Venue 8 Pro is than my Nexus 7. Using the Surface Pro is instantaneous. The Type Cover is also a joy to use, giving you keyboarding performance akin to what I'm used to with the X201.

The real revelation, however, is the stylus. I'd never tried any of the previous PCs in tablet mode, other than my use of the Wacom Bamboo tablet for producing Independent Cycle Touring. But while I wasn't paying attention, Windows' handwriting recognition has become nothing short of amazing. My handwriting can compete with any doctor's for sheer inscrutability, but the Surface Pro handled my cursive with aplomb, as long as I was writing common English words. Write something not in the dictionary, and, just like any other machine translation program, it produces gibberish. There was no training period, however, and I could pick it up and use it. You can even turn on Chinese handwriting recognition, though Xiaoqin pointed out that Pinyin is faster and much easier to use with a real keyboard. Unfortunately, having multiple languages on the machine is problematic if you use a keyboard, since Microsoft uses Windows-Space to switch between languages, and Xiaoqin found it far too easy to hit that combination by mistake. In past versions of Windows we tried to change the language key bindings but to no avail, so we gave up and uninstalled the language pack instead.

All tablets are compromises. The Surface Pro does not have great battery life: 3-4 hours with Android Studio and that's it for the battery. When running Android Studio at full tilt, the device also gets hot enough to turn on its fan, which makes a low hissing noise. It's quieter than the X201, but still noticeable if the room is otherwise quiet. Next to my Core i7 920 box going full bore, of course, it might as well not make any noise. At no point would you burn your hand grabbing the Surface Pro, however, so there aren't any safety issues.

Long term, the biggest concern about the Surface Pro is the battery. With the machine running hot, and the battery fully charged most of the time in desktop mode, I would be surprised to see more than 3 hours of battery run time after the first year, and 2 after the second year. Most laptop batteries get abused this way as well, but the Surface Pro has a non-user-serviceable battery, with the only option being the $200 Power Cover. Fortunately, for the price (which is much less than what I paid for the X201 way back when), we can treat the Surface Pro as a disposable computing device. This is much more of a concern nowadays, however, than it would have been 10 years ago. 10 years ago, you'd expect to replace a machine every 3 years. Now, an adequate machine (which the Surface Pro most definitely is) could have a potential lifetime of 5-6 years. At the rate Intel is improving (or not improving) CPU performance, I'm likely to keep my desktop for another 2-3 years at least!

There are a few accessories that I would recommend for the Surface Pro. The first is a Type Cover. We tried both the Touch Cover and the Type Cover in the store, and the Type Cover was hands down the winner. Secondly, you need a USB 3.0 hub if you're going to attach a phone for debugging as well as a wireless receiver for a keyboard and mouse. The Surface Pro comes with Bluetooth, but it was easier to just use the existing Logitech mouse and keyboard than to shop for new ones. USB hubs can be powered or unpowered, and we got an unpowered one for convenience when traveling. It'll make the device drain that much faster, but having one less power adapter to carry is essential.

In any case, so far I'm liking the Surface Pro far more than I expected, and Xiaoqin hasn't asked for the older X201 back. I don't expect to be sending it back to Microsoft within the 30-day return period.

Friday, April 04, 2014

Review: Pixeljunk Monsters Ultimate HD

I'm a big fan of the tower defense genre, and have sunk countless hours into the original flash-based game, as well as the much nicer-looking Defense Grid. Strangely enough, there aren't very many mobile versions of desktop tower defense that are worth playing, with Space Station Frontier being the only one that's really any good. (I found a common recommendation, Fieldrunners HD, to be extremely lacking in imagination compared to even the original flash-based game --- other games were filled with in-app purchasing, which sucks all the fun out of gaming, as far as I'm concerned.)

That is, until I picked up Pixeljunk Monsters Ultimate HD for my Vita. The game's also available on PC, but I feel it is best played with a joystick and control buttons rather than mouse and keyboard, so I would encourage you to get a controller for the PC, if that's your platform of choice. Obviously, the PC version would be much less portable than the Vita version.

This game is extremely challenging. On Easy difficulty, I found myself having to play most levels more than once. In particular, sections of the game are locked away unless you earn perfect scores on a certain number of stages, so I found myself replaying levels in search of those scores. On the easy difficulty at least, the game offers multiple paths to victory, with a great selection of different towers to use on various challenges.

The game features several unique mechanics. The first is that you control an actual character on the game board, which constrains you in several ways. Your character creates and sells towers, as well as picking up gems and coins (to purchase new towers). Your character can also upgrade towers by dancing on them, or you can upgrade towers with gems. Your character also needs to run back to the base in order to purchase new tower types. With all these constraints in place, the game ensures you always have a plethora of choices to make, which forces you to pay attention to every detail.

Another unique mechanic is earned interest between waves. If you don't spend your coins, the game grants an interest bonus between waves that generates more coins. This mechanic adds tension between setting up additional towers right away and waiting until the last moment in order to garner the most bonus interest possible.

Your character can also get run over by the invading monsters, which causes you to lose a number of coins and renders him incapacitated for a period of time. If you don't pick up coins or gems within a certain amount of time, they'll disappear off the game board, which again generates tension between running after the coins and staying put to upgrade towers by dancing.

The monsters are typical of the genre, with slow, fast, swarming, shielded, and flying varieties ensuring that you have to build a variety of towers in order to win.

Now, the most important part of a tower defense game is the maps or game boards. The game features approximately 21 game boards, and the variety between them is pretty cool. They range from the difficult to the nearly impossible to get through without taking losses, and are ranked by difficulty, with certain special missions granting you additional tower types or upgrades to your character.

The game's extremely replayable, and will be free this month on Playstation Plus if you have a Vita. In any case, I definitely got my money's worth from the game and can recommend it.
UPDATE: PixelJunk Monsters is available for $5 on the PC/Mac directly from the developer. This sale will last for 5 days.

Wednesday, March 26, 2014

The Oculus Rift "Sellout"

Yesterday's announcement of the Oculus Rift acquisition by Facebook has already garnered negative reactions amongst gamers and game developers. If you're a bystander, you might wonder why they got such a hugely negative reaction from people who should be their biggest supporters. Introducing a new technology requires high volume, and in gaming that usually requires loss-leading hardware sales in order to drive that volume. Who better than Facebook, with its deep pockets and huge profits, to fund such an endeavor? Google and Apple have deep pockets, but both would restrict such technology to their favored platforms rather than more open systems, while Facebook would be more platform-agnostic than just about anyone.

The negative reaction can be explained by the principle of reciprocity. The initial Kickstarter backers of the Oculus Rift, and the game developers who supported it, provided a gift to Oculus. The gift was intended to bring about an independent hardware platform that would be (rightly or wrongly) dominated by requirements driven by gamers. The backers did not intend to provide venture capital for Oculus to make a quick exit, and certainly not to sell out to a big company with a history of indifference towards games, one whose platform has historically supported games like Farmville, anathema to the hardcore gamers who comprise Oculus' demographic.

As for Facebook, this acquisition is a counter to the usual industry trends. The hardware required to drive something like the Oculus Rift is enormous and power-hungry. It is unlikely that the Oculus Rift can be tethered to anything less powerful than a Playstation 4 any time soon, and it certainly won't be able to run on any of the laptops typically distributed to Facebook employees, let alone the smartphones favored by the trendy. It looks geeky, is unfashionable, and looks ridiculous when worn. The only possible good it could do Facebook in the medium term is if it got them into the living room.

Corporate head-honchos at Google, Amazon, and Apple have long looked at the living room game console as the entry point to taking over the entertainment center of the home. The numbers look tempting to the corporate types. Hardcore game consoles from Sony, Microsoft, and Nintendo have only penetrated 56% of US households. The other 44% looks ripe for disruption. However, these corporate types tend to have zero passion for gaming, and most have never so much as held a gaming controller in their hands. They tend to envision something like the Ouya or the Chromecast, neither of which provides sufficient power or quality content to get 6-year-olds excited, let alone hard-core gamers. They fail to understand that the quality of content (whether it be a video game or high-quality Blu-ray viewing or streaming) is the reason why, so far, the game consoles have had a huge market share for living room usage.

The Facebook acquisition of Oculus Rift runs counter to that type of corporate thinking, and might actually succeed, if it doesn't start off by pissing off so many hard-core supporters that it poisons the well. That disadvantage is possible to overcome, but only by Facebook doing a thorough job of winning over gamers and developers through the kind of largesse that, so far, only Sony has proven capable of. Since Sony's Morpheus platform would presumably be tied to Sony's own consoles, the Oculus Rift is still the best hope for mass-market adoption of VR technology.

My prediction is that Facebook will screw it up with gamers (it's very unlikely, given its corporate culture, that it would do otherwise), and 5 years from now will look at Oculus as a poor acquisition, while Sony's Morpheus project will settle into a very small niche similar to the one the Playstation Move occupies. Sony simply does not have the financial ability to take big losses in order to drive market adoption, while Facebook lacks the cultural understanding of gaming to be able to do much other than poison the well with its ideal early adopters.

Sunday, March 09, 2014

Review: Telltale Games The Walking Dead Season 1 (PS Vita)

I got The Walking Dead as part of my PS Vita package. However, the loading times for the game were so long and the loading frequency so high that I gave up, only to restart when The Wolf Among Us persuaded me to give it another shot.

Just as with The Wolf Among Us, The Walking Dead is an old-style point-and-click interactive adventure. It claims that the game adapts to the choices you make, and that your decisions can affect the rest of the game downstream. This is true, but only to a very limited extent. For instance, characters will remember the decisions, speeches, and actions that affected them and talk to you about them later. Sometimes, you might have to choose between rescuing one person or another, and that decision will carry over to the next episode. However, the combinatorial explosion from providing a fully branching story-line would be too much even for the 2.5GB of storage the game consumes on your memory card, so the game cheats.

The problem with this cheating is that it robs you of the game's promise, and it's particularly obvious during emotionally tense moments of the game. For instance, episode 3 has a mystery that the player solves very quickly by meta-gaming: there's one character you're not allowed to interview or accuse, and of course, that character did it. Well, that's frustrating by itself, since there's a reveal in episode 4 and you're supposed to be surprised. But the worst thing about this setup is that one of the characters you saved previously gets shot and killed instead, and another character is left behind. It's one thing to not be smart enough to solve the mystery; it's another to not be allowed to solve it even when you know what happened and who to accuse.

If this were an isolated event, I'd be inclined to forgive and forget. But something extremely similar happens in the last episode, where despite your pleading to the contrary, the plot moves ahead and removes agency from you. Now, you'll note that I was more than happy to forgive and forget Uncharted 2 or The Last of Us despite both games being essentially linear with zero control over the story, but The Walking Dead repeats its claim of adapting to your choices at the start of each episode (there are 5 in total, plus a 6th collection of related short stories), so I was constantly reminded of its failure to fulfill that promise. Furthermore, both of those afore-mentioned games are primarily action games, and they do their jobs really well.

The Walking Dead, however, is full of technical glitches, at least on the Vita. The game frequently stutters, sometimes even loops, and has unsatisfying controls. If you use the touch screen controls, you don't get to choose which action to perform on objects when you touch them, but at least the "action" portions of the game, like shooting zombies, are easily achievable. If you use the joystick and buttons, you have much finer control over your actions, but the "action" portions of the game (like those in episode 4) become virtually unachievable, taking me upwards of 6 tries. This would be fine if the game allowed you to use both interfaces at once, but no, you have to pick one or the other and can't switch during an episode.

In any case, I cannot recommend this game for anyone other than die-hard fans of the comic books or TV series. The story is decent, and many have reviewed the game as having a better story than the TV series, which saves me the time of ever having to watch the TV series. Even for those die-hard fans, I would suggest either the PS3 version or the PC version, with the PC version preferred at $8.50. If you can wait for a Steam sale, you can get the games for under $5. The reviews for the game online are nothing short of stellar (and Sony believes those reviews, since it created a bundle for the holidays), but for this reviewer anyway, the game's premise was not delivered on, because of the technical problems and the ham-fisted approach to plot. Not recommended.

Wednesday, March 05, 2014

When will the next generation of game consoles launch?

While I was thinking about why game consoles are likely to be around for quite a while, it was instructive to take a look at the most likely launch window for the next generation of game consoles. This is discounting Android consoles such as the Ouya, the future Amazon game console, and whatever Apple or Google cooks up. My expectation is that those game consoles will be under-powered, unattractive to hard-core gamers, and unable to attract AAA titles.

The one possibility that could derail my predictions is if Steam boxes take off, but given that Steam boxes will run Linux and not support much of the existing game library, I do not expect them to be a major player. Steam boxes have a bunch of issues, not least of which is that each Steam box would have a different configuration, meaning that the uniform platform game developers get from consoles wouldn't exist. I expect Steam machines to combine the worst defects of both PCs and consoles.

The driver for the next generation of consoles will most likely be 4K TV. If you look at what would be acceptable for 4K gaming today, the AnandTech analysis suggests that no fewer than 4 GTX Titan video cards would be needed to drive a AAA 4K game at 60+fps. That's at max settings, however, and consoles do not need max settings to be good enough, so maybe 3 such video cards would be sufficient. Each of those video cards is about $700 today, so 3 of them would be $2100. If we assume that the consumer part for these devices has to get below $200 in order to be commercially viable, it would take four Moore's law cycles to get to about $132 for the compute equivalent of 3 of those video cards.
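
As a back-of-the-envelope check (assuming the cost of a fixed amount of GPU compute roughly halves with each Moore's law cycle):

\[
\$2100 \times \left(\tfrac{1}{2}\right)^{4} = \$131.25
\]

Three halvings would only get you to $262.50, still above the $200 target, so four cycles is the minimum.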

The traditional Moore's law cycle is 18 months, but in recent years it's been more like 2-3 years, so four cycles means 8-12 years. However, there are several mitigating factors. First of all, it might not be necessary to deliver games at full 4K resolution. The 7th generation of game consoles only delivered 720p to HDTVs, and it's entirely feasible that the 9th generation of consoles would deliver, say, 2560x1440 rather than "true" 4K. This would be particularly attractive for whoever loses the 8th generation console war, since that vendor (currently Microsoft) would be more motivated to start the 9th generation earlier rather than allowing the 8th generation to be dominated by a major competitor. It's entirely feasible that Nintendo could do this as little as 4 years out, but given their recent statements, I do not expect them to try to compete with Sony or Microsoft on CPU or GPU horsepower. Secondly, it's quite possible that dedicated gaming hardware with software written close to the metal will outperform AnandTech's benchmarks. Put all this together and I expect the next generation of consoles to be delivered within 5 years, rather than the 8 years between the PS3 and the PS4. This is made much more likely now that both consoles are on the x86 architecture rather than custom hardware, enabling more frequent updates.

I have to say that I'm fairly excited about 4K, though I disagree that 4K streaming is necessarily a good thing. HD streaming already looks much worse than Blu-ray to my eyes, so my guess is that it would take a higher-bandwidth delivery format than Blu-ray for 4K to truly take off.

Tuesday, March 04, 2014

Are you surprised by the success of the game consoles?

If you read reviews in general interest newspapers or magazines like the New York Times, you would be surprised to learn that the PS4 has been out of stock nearly everywhere. This is surprising since the consoles face competition not only from tablets and PCs, but also from a range of devices from the Roku to the Chromecast. One would think that those cheaper devices would out-rank expensive and "difficult to use" game consoles for Netflix streaming, but NPD reports that 39% of Netflix users who watch movies on a TV watch them on a game console! This figure is even more amazing when you consider that only 56% of US households have a game console, meaning that those game console owners are much more loyal to their devices than Roku or Chromecast owners. By the way, this unexpected use of the game console as a media device explains why the Wii U had such low adoption rates: the Wii U cannot double as a Blu-ray player, nor can it play movies from an external device.

The mystery becomes much less of one when you consider recent computer ownership trends. Most laptops simply do not have a GPU capable of running modern games. For instance, the highest-end Apple Macbooks and iMacs have an Nvidia GT 750M inside them. This is a 722 GFlop GPU with 384 stream units. That sounds very impressive, until you consider that the PS3 and PS4 each claim around 2 TFlops of performance, with the PS4 running 1152 shader units. But that's not all: the high-end Apple machines are driving 2880x1800 or 2560x1440 displays, while the PS3 only drives 720p output and the PS4 only drives 1080p output. In other words, the game consoles have roughly 3X the GPU power but are driving a fraction of the pixels of the laptops.

Tablets are even worse, as they frequently have resolutions comparable to the above but have to be optimized for battery life of 10 hours or more. The iPad Air has 76 GFlops (about 1/10th the GPU power of the high-end Mac) while driving a similar number of pixels. For reference, the PS Vita has 38.4 GFlops, but drives a display that's 1/6th the resolution of the iPad Air, so it's got about 3X the equivalent power of the iPad in terms of pushing pixels around. Note that the Vita has a battery life of 3-5 hours, as opposed to the 10 hours you would expect from an iPad.
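
To put rough numbers behind that claim (using the commonly quoted panel resolutions: 960x544 for the Vita, 2048x1536 for the iPad Air):

\[
\text{Vita: } \frac{38.4\ \text{GFlops}}{0.52\ \text{Mpixels}} \approx 74 \quad \text{vs.} \quad \text{iPad Air: } \frac{76\ \text{GFlops}}{3.15\ \text{Mpixels}} \approx 24\ \text{GFlops per Mpixel}
\]

which works out to roughly a 3:1 advantage for the Vita in GPU throughput per pixel, the "3X" figure above.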

Gaming PCs are a different story, since they don't have to run at low power, but if you try to build a typical PS4-alike PC, not only do you fail to hit the $400 price point (while ending up with a much larger case and lower memory bandwidth than the PS4), you also run out of budget for a Blu-ray drive or a controller. You could buy an Alienware PC, but now you're looking at a budget well over $400, and you're locked out of Sony's exclusive games for the PS4, which, judging from the track record on the PS3, would be a fairly substantial loss. This explains why my PC gamer friends were disappointed with the PS4 and XBox One announcements, while the market has proven that those consoles are selling very well. The typical PC gamer will have a $1,000+ PC that will outperform any of the consoles, but PC gamers are also an increasingly small percentage of the population compared to the number of folks toting Macbook Airs.

The problem with PC gaming recently has been the focus on lower power rather than higher performance. Intel has simply chosen to use its chip real estate to increase performance per watt by delivering more cores rather than a bigger GPU. While most consumers couldn't care less about reduced power consumption on their machines (most PC users still turn their machines on and off, and don't run them at full capacity often enough for the power bills to matter), Intel's primary customers for high-end processors are the companies running high-end data centers like Google and Facebook, rather than the individual consumer looking for maximum single-threaded performance. In addition, it's hard enough to get others in your household to use a controller for movie streaming, let alone the mouse and keyboard required to manipulate a PC UI.

The net result of this set of trends is that, unlike many other pundits, not only do I not think that console gaming is dead, I expect to see a console-like device in about 50% of homes for the foreseeable future. There will be a 9th generation of consoles, and beyond. The typical household doesn't consider PCs/laptops anything other than work devices, and will continue to buy separate game consoles both for streaming video and for playing high-end games.

Thursday, January 30, 2014

Review: Batman Arkham City

Arkham Asylum was the first game I ever finished on the PS3, and Arkham City came with extremely high reviews, with many critics saying it was even better. Unfortunately, by the time I got around to starting Arkham City, Bowen was born and I was too busy to play. A recent video card upgrade and a Humble Bundle sale meant that I could continue on my PC with its 1440p screen and all the glory thereof, albeit several years later.

First of all, properly playing this game on the PC pretty much requires an XBox 360 controller. For a while, I used my PS3 controller via DSTool, but then one day DSTool went and took over my Logitech wireless receiver, which was not cool. And yes, I did try a cheap Logitech knock-off controller, but it was unsatisfying and imprecise, so I can't recommend any of the alternatives to giving Microsoft money. At least the Microsoft wired controller provides vibration feedback, so you do get something for it.

Once you've got everything set up, the game plays well, and it's a beautiful game, provided you like night-time or indoor spaces. If I'd never played Uncharted before, I would say that this was a great game, but having compared it to the best of the story-based games, I have to say that Arkham City has several flaws, even when compared to Arkham Asylum.

First of all, the game is a bit too open, and tries to bombard you with information overload the entire time. This is par for the course if you're Batman, of course, but seriously, we already get plenty of information overload in our daily lives; I'd think an escapist video game wouldn't need it. You could barrel along at full speed through the main story and ignore all the side missions, but that's not how the game was designed, so you'll find yourself ignoring the ticking clock the story points you at and doing a few, just to get enough experience points to upgrade Batman a few times.

I played the game on easy, but even then there were a few frustrating places where you felt that the controls just didn't work. The remote-controlled batarang, for instance, was an exercise in frustration with keyboard and mouse (I bought the controller to get past this section). Even with the controller, it's not perfect, just barely workable. The fight with Joker and his underlings was a major difficulty spike: none of the rest of the game is nearly as difficult or challenging. It took me about 6 tries to get past that.

The story, of course, is incredibly well written. The writers pulled out all the stops and didn't balk at eliminating the possibility of a sequel (though, as we found out a few years later, the franchise gets around the problem by providing a prequel). I think this is easily one of the best Batman stories ever told, in any medium, and the fact that it ties in with Arkham Asylum so well as a direct sequel is a strength. The penalty, however, is that it's overly long, and uses more of Batman's portfolio of villains than you can keep track of. I clocked about 17 hours finishing this game, and that's without trying to do all the side missions or the additional DLC.

The game also enables you to play as Catwoman several times during the story, and she plays differently enough from Batman that it's a fun adventure in its own right, though relatively short and fairly easy. It's a good relief from all the craziness that Batman usually gets.

Having said all that, the game does a great job of making you feel like Batman. The controls are fast and responsive, and whenever you work your way around a room or environment picking off one goon and intimidating the rest, you get a thrill of what it means to be Gotham's greatest detective.

Overall, I recommend this game, but I think it's a bit too hardcore for a general audience. If you're not a hardcore gamer, you might discover halfway through that you've bitten off quite a bit more than you expected. Arkham Asylum's quite a bit better in that regard, so if you haven't played that, do that game first. Both games are relatively old now, so they're cheap to get.

Friday, December 06, 2013

Review: Amazon Playstation Vita Holiday Bundle

It's not a secret that in recent years, the iPad and other tablets have become the most ubiquitous video game platform. However, the games for the platform are pretty sad. Much of this has to do with the touch interface: there are precious few games that can do it justice. So far, Space Station Frontier and Strikefleet Omega are the only two games that I think use the touch interface for maximum effect and fun gameplay. Most of the other games are either marred by extensive in-app purchasing business models or by absolutely terrible UI. For instance, The Dark Knight Rises uses a touch-based joystick and is pathetically unplayable. Even Clash of Heroes, which is a port from the Nintendo DS, suffers from playability problems due to the touch interface. One would think the tower-defense genre would be perfect for a touch interface, but even then, none of the games I've tried come close to Defense Grid.

The net result is that for an immersive experience, I have to play on a PC or a game console. Yet both of these are relatively heavyweight operations. For instance, the PS3 lives in the living room, and the gorgeous experience means it takes up everyone's attention while it's on. The PC obviously requires boot-up time, and while it provides a nice interface, it's still a little less satisfying than a dedicated game console. And the biggest defect is that when I'm on the PC, I feel guilty about doing anything other than working on the next book!

Amazon has an exclusive holiday bundle for the PS Vita that includes a small memory card and 3 games, so I picked it up. One reason to pick up a PS Vita this year, as opposed to waiting for yet another price drop, is that next year the Vita will lose its OLED screen and switch to an LCD screen instead.

After unboxing the unit, I downloaded Uncharted: Golden Abyss and started playing. It's a beautiful experience. The screen is good enough that the hardware seems to disappear, and the controls are responsive and quick. While not the full-screen movie experience that Drake's Deception is on the PS3, it was an incredible hook for me, and motivated me to download Drake's Deception on the living room console after getting started on Golden Abyss. If you're used to tablet games, the games on the PS Vita are a revelation. For one thing, there's no in-app purchasing, in-game advertising, cross-promotion for other products, etc. You paid for the game, you get to play. It's great. And unlike even PS3 games, there's no constant need to download patches, new content, etc. (I was shocked that Drake's Deception, after a 40GB download, required another 500MB patch to start.) The interface is intuitive, though some of the touch controls are gimmicky (in particular, melee combat using the touch screen is as pathetic as it would be on a tablet).

One of the best deals in gaming is the PlayStation Plus membership. If you get in on the holiday deals, this is about $30/year for effectively 2-3 free games every month across all your platforms (PS3, PS4 -- currently out of stock online -- and PS Vita). During the holiday season, Sony's promising a free game a week. For a light gamer, this is more games than you'll use, and is an outstanding deal, even if many of the games would not be to your taste.

The PS Vita can also act as a remote screen for your PS3. I tried this with Shadow of the Colossus, and it's an impressive piece of work, streaming audio and video from the PS3 to the Vita. There aren't that many games that support PS Vita remote play on the PS3, but if you do upgrade to the PS4, the story is that Sony has required all games on that platform to support the Vita.

The best way for me to describe the Vita is that it's like a Kindle for video games. Video games are a lot like books in that the time it takes to finish one is substantial, and for adults with children, the only way to consume them is in sips rather than binges. In that sense, an instant-on, always-available device that lets you get in a few minutes here and there is well worth the entry price. Recommended.

Saturday, November 30, 2013

First Impressions: Dell Venue 8 Pro

I gave my wife a Dell Venue 8 Pro for her birthday. Last year, she got a Nexus 10, but she found that the 10" form factor led to it not getting used much, and Bowen ended up being the primary user of the device instead. As a result, she asked for a smaller device. While I've been pleased with the Nexus 7, all the indications are that the latest Intel Bay Trail processors run circles around their ARM equivalents, so I took a leap of faith and bought the Dell when it became available for $230. If you shop around on Black Friday I'm sure you can find similar deals for this device.

The best thing about the Dell is that it's a Windows box.  That means, for instance, that I can back it up using the Windows Home Server that I use for backing up other PCs. This is a great feature, since the default Nexus/Google tablet backup doesn't seem to back up as much as it ought to. (Switching devices, for instance, usually means a massive reinstall of Amazon stuff)

Unlike Google, Dell doesn't suffer from as much Apple envy. As a result, we get a microSD card slot. What this means is that even though I bought a 32GB device, an additional 32GB costs only $18, with the prospect of very cheap upgrades to 64GB and 128GB cards in the future. This kind of future-proofing means the additional $5 premium over the latest generation Nexus 7 is well worth it.

On top of that, the Dell comes with a full license for Microsoft Office, and can run any of the applications that a regular Windows machine can. That means the contortions I went through to turn the Nexus 7 into a partially decent photo-editing tool are obsolete: I can just run Lightroom! Or Picasa. None of those have Android equivalents, and it truly is amazing to have them available in this form factor.

The device is fast! After using it as a web-browser during Black Friday to do some online shopping, the Nexus 7 feels slow. What this means is that while buying from Amazon is straightforward on the Nexus 7, any other site (such as Newegg) is agonizing. The Dell, however, feels just like my desktop. Everything behaves like it should, and pages load fast and snappily.

What's more, Windows 8.1 really does make use of all the power of the device. For instance, my wife discovered she could run two "Metro" apps side by side. That's a power-user feature, but she found out about it quite naturally by simply playing with the device. And of course, Windows has had multi-user support for ages, so everything works like you expect. For instance, installing Chrome for one user automatically installs it for all users, as long as you use administrator privileges to do so.

There are cons to the device: Windows 8.1 is not quite fully adapted to the world of touch devices. For instance, at times the on-screen keyboard will block the input field you're typing into. That's annoying. Unlocking requires typing in a password instead of using gestures or a PIN. That's potentially a security hole if you use it on a plane and it shares the same password as your regular desktop, which Microsoft encourages because you're supposed to log in with your Hotmail account.

At 1280x800, the Venue 8 Pro has pretty much the same resolution as last year's Nexus 7, instead of the 1920x1200 that the new Nexus 7 sports. The reality, though, is that if you need to pop out to the desktop for any amount of time, the lower resolution means UI targets are bigger, so your fingers don't feel as fat. So it's not as big an issue as you might imagine.

And of course, there isn't the rich ecosystem of "apps" on the Windows 8 store. But who cares? Since you're running full Windows with a real web browser, services like Amazon Instant Video, which aren't available on the Google Play store, simply run on the device as they would on any PC. Anybody who faults this device for the lack of apps doesn't understand that it's a real PC, not just a tablet.

All in all, this device is the tablet that Microsoft should have launched last year, instead of foisting the Surface RT on us. It's as cheap as an Android tablet, but has much, much more functionality. Highly recommended.

Saturday, October 27, 2012

First Impressions: Windows 8 Pro

I'm not normally an early adopter, preferring to upgrade things after other people have already eaten the beta. But Microsoft made all Windows 7 users a great offer with its $40.00 online upgrade, so I couldn't resist and upgraded my personal desktop to Windows 8, confident that I could deal with whatever usability snafus Microsoft could throw at me. After all, I consider EMACS a good user interface.

The purchase UI was terrible, not because it was difficult to fill out, but because Microsoft's servers were overloaded and rejected my attempts to buy the upgrade 4 times before I finally succeeded. Where's a decent monopoly when you need one?

The upgrade process was fairly straightforward. Burn the downloaded .ISO to DVD, and then run the setup file directly from inside Windows 7 (if you want a fresh start, reboot and boot from DVD). The upgrade assistant does a good job of telling you what will or will not work on your upgrade, and which tools you'd have to reinstall. It even reminded me to deauthorize my iTunes account and reauthorize it when the machine refreshed.

People with SSDs will tell you that the upgrade takes all of 18 minutes. I have a hybrid SSD, which is of no help whatsoever when it comes to an OS upgrade, so I started it off, had lunch, and came back to find the setup screen waiting for me.

There are several nice features waiting for you in Metro. Unfortunately, I could not use most of them, because I discovered that if I linked my Windows account to my Windows Live account (which is the way Microsoft intends it to be used), my mapped network drive to Windows Home Server stopped working. Doh! Fortunately, I don't really care about sharing from the new suite of Metro apps, so I cheerfully reverted back to my local Windows account, and my mapped network drive worked again.

The strongest point of Windows has always been backwards compatibility. Even so, a few things didn't work. I ended up uninstalling ATI's Catalyst software for managing the screens. This is not a problem, because Windows actually manages all that without issues. I had a few horrifying moments when I thought that SleepyHead had stopped working. I would have happily uninstalled Windows 8 Pro and reverted to Windows 7 if that had been permanent. But after swapping around some display settings and turning on the Windows 7 compatibility checkbox, SleepyHead was working, perhaps even better than usual, because it stopped trying to foist those annoying updates on me.

In fact, one thing I've noticed is that my monitors now enter and resume from sleep mode with a stability that Windows 7 never had. It could simply be that I should have uninstalled ATI's Catalyst a long time ago.

The coolest thing about Windows 8 Pro? The new performance monitor. Not only does it tell me what it used to, it also shows me what clock speed I'm running at and whether Turbo Boost is working, and it just looks cleaner. Nice.

Windows Home Server recognized Windows 8 (despite predating it by years), and proceeded to back up my machine just fine. That's pretty nice. I do miss the old start menu, but learning to hit the Windows key to bring up the Metro UI didn't take much. I played around with some apps in the new UI, and to be honest, I'm not sure why I'd want a tablet-style interface on my desktop. It's just too annoying to only have one thing on screen at a time. Nevertheless, if you have 2 monitors, you can designate one for the new UI and keep the other for real work. So that doesn't suck at all. Windows-PageUp and Windows-PageDown also let you flip the Metro UI from one monitor to the other dynamically, and it's pretty snappy.

The new Mail and Calendar apps are pretty worthless: as far as I can tell, they're no faster than Thunderbird or any other mail client, and seem more limited. The news reader apps are beautiful, but again, do I really want to use my beautiful 27" screen to host one story at a time, or would I rather have my tabbed interface on Chrome and flip between multiple stories? No contest, the traditional UI won. It is nice, however, to have games running in full screen, and some of those Metro-style games are pretty fun even with a mouse. Also, it's nice to have Google Search with voice recognition, the way it works on my tablet. Google should have integrated that into Chrome ages ago, rather than waiting for Microsoft to force them into it. The Metro Kindle app is also a thing of beauty. But again, I didn't buy a 27" monitor to use it as a book reader.

Windows 8 was touted as having a high-performance boot. I tested several reboots with a stopwatch, and discovered to my horror and disappointment that boot speed was abysmal: 1:45 for a machine that used to take 1:00. I re-read the linked blog post and discovered that I was doing it wrong: hitting the restart button causes a cold shutdown and a cold boot. A standard shutdown followed by a boot, however, yields much faster results: 0:45 to a usable desktop, less than half the cold-boot time and faster than Windows 7 ever managed. Do this a few times so the hybrid SSD can learn the OS usage pattern, and I'd expect the boot performance to get even better.
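
Incidentally, you don't have to rely on a stopwatch: Windows records a boot-duration event (ID 100) in the Diagnostics-Performance event log, and you can pull it out with wevtutil. Here's a rough sketch of the idea in Python; the "BootTime" field name (in milliseconds) is my assumption about the event payload, so treat this as a starting point rather than gospel.

import re
import subprocess

def recent_boot_times(count=5):
    # Ask wevtutil for the last few "Windows has started up" events (ID 100).
    xml = subprocess.run(
        ["wevtutil", "qe", "Microsoft-Windows-Diagnostics-Performance/Operational",
         "/q:*[System[(EventID=100)]]", "/c:%d" % count, "/rd:true", "/f:xml"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Pull every BootTime value (assumed to be milliseconds) out of the event XML.
    return [int(ms) / 1000.0 for ms in re.findall(r'<Data Name="BootTime">(\d+)</Data>', xml)]

if __name__ == "__main__":
    for seconds in recent_boot_times():
        print("boot took %.1fs" % seconds)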

Do I recommend Windows 8? It depends. To me, the new OS feels just a bit snappier at certain tasks --- flipping between applications seems faster. What I like most is the increased stability of sleep mode; sleep didn't use to be this fast and reliable. I like the faster boot speed as well. I do miss the old start button, but I haven't found adapting to the new UI to be onerous or horribly confusing. It is a bit jarring, though: you do feel like you're flipping modes.

All in all, for $40, I feel like it's a decent deal. I wouldn't pay full price for the upgrade, especially on a desktop without touch. I wouldn't go out of my way to do the upgrade (I did, but I was also curious about the new UI), but it's nothing to run away from either. I certainly don't see it as the disaster some pundits have been calling it. Mildly recommended.

Monday, September 24, 2012

First Impressions: Google/Asus Nexus 7

I'm writing this review late, since there are plenty of Nexus 7 reviews out there. However, I did get the Nexus 7 as a birthday present recently, so that's my excuse. Why the Nexus 7, instead of say, one of the latest flock of Kindle Fires?

I could complain about my frustration with not having access to the native Gmail app, and about how I dislike the forking of Android, even though I understand the business reasons behind it. By far the most annoying issue, however, is that the latest Kindle Fires simply do not have GPS! Now, you might think that the lack of a GPS shouldn't matter on a device without always-on connectivity, but first, Google Maps recently gained an offline capability (though not a completely useful one --- navigation, for instance, absolutely does not work offline). Secondly, Frank Spychalski pointed me at this article about using the Nexus 7 outdoors, and it looked quite usable: you do have to spend $15 for U.S. Topo maps (which is easily covered by the $25 Google Play credit), but the screen is much better than, say, the Garmin Edge 800's, and it's a better deal than the Garmin Topo U.S. at $60.

By the way, I spoke to a Kindle designer on my recent Birthday Trip and he assured me that the next iterations of the Kindle Fire will have GPS.

So what about the device proper? My brothers splurged and got me the 16GB Nexus 7, so the first thing I did after charging it was to log in and start loading up all the apps I had deleted from my phone ages ago due to the N1's meager 256MB of internal storage. It's interesting to see which applications make a difference versus just using the plain old web browser: apps like Quora, for instance, are surprisingly useful because the website is poorly designed for a smaller device. Apps like Delicious, on the other hand, are required only so that other apps know how to share to Delicious, not because anybody sane would want to use the Delicious app itself. By far the most sophisticated apps are games. The big screen, high-definition display, touch screen, and tilt sensor make games great. The battery life was also decent: I could run MyTracks for 2 hours and change, and still use the machine intensively for the rest of the day without draining the battery. That was a surprise. Outdoors, the screen was usable, though not as bright as I would like it to be.

The speakers are the weakest part of the Nexus 7. They sound pretty terrible. Fortunately, the headphone jack works without fuss.

I paired the Logitech PS3 Media Board with the device (it was what I had lying around, and I'm not about to buy a new keyboard just for the tablet), and it worked great. My typing speed is as fast as on a real computer, and I couldn't out-type the machine. What was even more impressive was that the touchpad worked! That was unexpected, and as a result I can IM as quickly from the N7 as I do from the desktop --- my friends couldn't tell the difference. Despite all that, I still found myself returning to my desktop machine for blogging, and I still refrained from reading important articles in the Google Reader app. The truth is, if you're a photographer, you still end up booting your PC to read photos off an SD card, and I can't imagine preferring a 7" screen to a 27" display for serious writing. However, what I do see myself doing is using this to do a quick check of e-mail in the morning without booting up my power-hungry PC, checking my calendar, and so forth. Also, I had been contemplating buying another laptop for travel so that XiaoQin and I could each have one, and I could see this eliminating the need to carry a second laptop. Of course, carrying the Logitech keyboard is not ideal (I'm certainly not about to carry it onto a plane), but on the plane I expect to just watch movies on it.

Ultimately, if the Nexus 7 died tomorrow, would I run off and buy another one? Probably not. It's still not as good a fit for my life as, say, the Kindle Keyboard in combination with a smartphone. In summary, though, I recommend the Nexus 7 over, say, the Kindle Fires or the iPads.

Thursday, December 20, 2007

The Witcher Review

Over the last two weeks, my leisure time has been pretty much sucked up by this completely engrossing game: The Witcher.

The Witcher is an RPG set in a fairly typical Tolkien-esque fantasy setting. You have your dwarves, your elves, your dryads, your monsters (too many to list, but suffice to say you won't be whining that it's simply a world full of short humans and pointy-eared humans running around).

The game has you playing as the titular hero, "The Witcher", which in this world refers to a member of a special group of monster slayers. Far from being normal, Witchers are a special breed: they have mastered not only swordplay but also sorcery, and they rely on special potions and other concoctions that have mutated them far beyond a normal human. There are also normal human monster slayers in the game, but the game does a great job of explaining the difference between a normal human who happens to slay monsters and a Witcher whose primary job is to slay monsters.

So far, pretty rote, right? Nothing special. Some might even be turned off by the fact that you don't get a choice in how your character is created... what if you don't want to be a sword-wielding monster slayer? How about ranged weaponry? Why swords? Well, the game doesn't give you any choice at all. You even have a name assigned to your character... so those hoping for an Oblivion-esque experience have to be prepared to accept this compromise, or be prepared to skip one of the best RPG experiences of the last 5 years.

As mentioned previously, the game doesn't give you a choice in who to play... you have to play Geralt, who is apparently a famous Witcher in his own right. The game gets around the "this guy is so buff he doesn't need to level up" problem of playing a legend by giving him amnesia, so even though he knows how to use a sword, he's not great with it (yet). Although it's a bit of a cliched mechanic (how many more amnesiacs do we need to play??), it is handled very well through the great and mature writing.

Which brings us to what makes The Witcher a great game: the writing. The world The Witcher takes place in is a very gritty one. Think Glen Cook's Black Company for a comparison. In this game, folks dying is just the beginning of the exploration of mature themes. There is rape, torture, incest, and, well, love. And that's just the tip of the iceberg! The developers, however, cannot be credited with such a strong world, as most of the world building was done by the author of the Witcher series of novels. Never heard of the Witcher series? Well, you're not the only one, as it was previously a fantasy series sold only in Poland. An English translation of one of the novels is coming over the pond, however, and it should be an interesting read.

What the developers can take credit for is how well they enhance the RPG experience. In most RPGs, you have your choice of being good or being bad, with very little grey in between. When you're good, you're as good as the purest snow, and when you're bad, you're like the lovechild of Hitler and Stalin.

The Witcher doesn't let you off so easily. There are moral ambiguities in almost every decision you make, and there is no right or wrong answer. Each answer can be justified one way or the other, and surprisingly enough, they even give you a choice of remaining neutral! With a character as powerful as Geralt, the game portrays what happens to neutral characters very realistically, and it is rather satisfying to see such thought put into the game story.

Beyond the whole moral landscape of "good", "bad", and "neutral" decisions, there are also other in-game decisions that affect how the game plays out. A decision you make in Act 1 (there are 5 acts, plus a prologue and an epilogue) can come back and affect you in the later acts. This is incredibly masterful, in that it prevents the usual "save & load" syndrome that many RPGs have. Because decisions have an immediate payoff and a later payoff unknown to you, almost every decision you make feels weightier than ever! The best part is that it really does make the world seem alive. A character you save in the beginning can come back to help you, or come back to haunt you (literally)... a decision to help a certain faction can deny you some side quests later... these decisions are never game-breaking (they don't break the main quest), but they add so much flavor and detail to the game that it's hard to ignore.
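
If you've never seen this kind of design before, the mechanism is easy to picture: the game quietly records each choice in its world state and only consults it acts later. The sketch below is purely my own illustration of that idea; the names and acts are made up and have nothing to do with the game's actual scripting.

# Illustrative only: a tiny model of deferred consequences, not the game's real code.
world_state = {}

def decide(act, choice_id, value):
    # Record a choice the moment the player makes it; nothing visible happens yet.
    world_state[(act, choice_id)] = value

def resolve_act4_encounter():
    # Acts later, the stored choice changes who shows up to help (or haunt) you.
    if world_state.get((1, "saved_villager")):
        return "The villager you saved in Act 1 returns with reinforcements."
    return "You fight the Act 4 ambush alone."

if __name__ == "__main__":
    decide(1, "saved_villager", True)   # made in Act 1, with no obvious payoff
    print(resolve_act4_encounter())     # the payoff only lands in Act 4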

A special mention needs to be made here for the story. The story is actually fairly simple, but the amount of detail put into it is what makes it incredible. Add the choices you can make that customize the game towards you, and the story becomes a rather rich, multi-layered affair. It's similar to something like Lord of the Rings (get rid of the ring): a fairly simple story, but the amount of detail and layering added to it turns it into a classic. The Witcher is much the same way. Don't expect all the threads to be resolved either; there are at least two pieces of ambiguity that never get resolved by the end of the game, and it's not really so much a "let's have a sequel" effect as it is a "real life rarely reveals all either" effect.

There's also an option for the player to engage gratuitously in sex, and while some might object to it, it's easily skippable. It's also non-explicit and non-interactive, for those really prudish about what they like to see in a game. I see it as an almost necessary feature in an RPG that claims to be gritty, however.

The combat system is also fairly... revolutionary for an RPG. The game engine is a heavily modified NWN engine, and the combat is much the same: a real-time clickfest. But instead of just rapidly clicking until your enemy is dead, there's a timing aspect to it. You have to click at the appropriate moment within the animation to make the character combo into his next move. While it sounds like yet another Simon Says game, the timing is actually fairly complex, and on the higher difficulty levels the window for the click is reduced and there is no visual aid (on easy and normal, a flaming sword icon replaces your mouse cursor, telling you "this is the time to click!").
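
To make the mechanic concrete, here's a rough sketch of how a shrinking combo window might work; the numbers and names are mine, purely for illustration, and have nothing to do with the game's real tuning.

import random

# Illustrative only: none of these window sizes come from the actual game.
DIFFICULTY_WINDOW_MS = {"easy": 600, "normal": 400, "hard": 200}
SWING_MS = 1000  # assumed length of one sword-swing animation

def combo_continues(click_offset_ms, difficulty="normal"):
    # The click window sits at the tail end of the swing and shrinks on harder settings.
    window = DIFFICULTY_WINDOW_MS[difficulty]
    return (SWING_MS - window) <= click_offset_ms <= SWING_MS

if __name__ == "__main__":
    # Random clicking chains combos far less often as the window shrinks.
    for difficulty in DIFFICULTY_WINDOW_MS:
        hits = sum(combo_continues(random.uniform(0, SWING_MS), difficulty) for _ in range(1000))
        print("%s: combo continued %.1f%% of the time" % (difficulty, hits / 10.0))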

Graphically, the game is gorgeous. It's easily one of the prettiest games I've played, and I'd say the artwork rates higher than even previous heavyweights like Oblivion. Combat is a rather beautiful affair as long as you only look at what your character is doing... Geralt is very much a work of art when swinging his sword, and even though it makes no sense that he has two styles of fighting with two very similar weapons, it's mostly there to give the player more eye candy and more character customization options (max out the silver sword? the steel sword? magic?).

The cutscenes are mostly rendered using the game engine, and I have to say I have never seen in-engine cutscenes rendered this beautifully in an RPG before. You'll swear the cutscenes are basically movies... until you get to the actual movies, which are even better. They really did capture the way Geralt moves and fights. The scenery is also spectacular, ranging from a complex city to a dreary swamp to a very cheery village. Locales are varied, and dungeons are kept to a minimum and appear only in sensible locations.

So... what are the shortcomings of a game like this? The first is that load times can be rather long, though the newest patch promises to solve this issue. The other is that you can't ever really be evil. Even when you side with the wrong faction, the game makes sure you realize the mistake down the road and corrects it for you. Though the correction makes sense, it can still grate on those who truly revel in whatever evil their in-game character can create. Given that this is a game about the titular character, that should be expected, but I can still see it as a problem. Oh yes, there are also numerous crash-to-desktop bugs, but once again, these problems appear to have been patched away (I didn't patch for fear that I'd have to start the game over!).

The game is also long, 50 hours or so. Some might see that as a shortcoming; some might think it's not long enough. =)

So, in summary: one of the best games I've played this year, ranking up there with Stalker and Portal. For an RPG to engross me this readily is no longer an easy task (I stopped playing NWN2 and Oblivion within 10 hours each), and I can heartily recommend this game with no second thoughts.