Indie games don't count. They don't sell units because, to be frank, they aren't that good.
This is categorically false. Indie games (in this case I'm using the term to refer to games development not backed by AAA studios) have long been successful and responsible for innovation in the PC gaming world. Go back to the days of shareware in the 1990s and companies like Apogee and id that helped create the most popular video game genre (first-person shooters).
In modern times it's not uncommon for the more popular indie games to sell into the millions of units. And there are dozens if not hundreds of high-quality indie titles to be had (I could start rattling off titles if you'd like). You even have cases like modern indie company Mojang getting sold to Microsoft for $2.5 billion. Or a game like Star Citizen becoming the most successful crowd-funding campaign in history at over $100 million.
So no, the idea that indie titles either don't sell or are poor quality is a complete myth.
Without AAA studio support on a massive level any optional tech is doomed to fail.
Not really true. There are peripherals that find a niche, occupy it, and survive. Look no further than PC flight sticks or driving wheels + pedals.
I'm honestly unsure what you mean about tablets...
I meant that people said they were too expensive and would never catch on. And then people bought them and they caught on.
Sound cards and video cards have never been optional in the entire history of modern computing. Sure, back in the pre-486 days... but PCs weren't PCs until post-Win95 and the invention of DirectX as a common hardware interface for programmers.
PCs have been PCs since 1981: the release of the IBM PC (and subsequent clones). A lot of PC hardware we now consider standard started out as optional: sound cards, modems, network cards, 3D-accelerated graphics chips, heck, even mice.
Later they became standard, but it was a number of years before that happened.
You'd be surprised to find that the vast majority of computers do NOT have add-on sound and video cards.
But this wasn't always the case and that was my whole point. What is now standard began as optional.
Even ignoring that, that's the PC... the hardware platform and all of its components have nothing to do with the I/O, which is what I'm referring to. Not a single solitary optional accessory has ever caught on...
from the NES Zapper to the PS Move, and even more mundane ones like the Genesis six-button controller. There were ~15 games made for the Zapper over the NES's 10-year run. The Genesis six-button controller was virtually unused in the entire Genesis library... the handful of fighters excluded, of course. As for the Move... forget about it. Meanwhile the Wiimote, which is basically a PS Move, was used in virtually every game in the Wii's library. Why? Because it was mandatory. The only thing that's ever come close is the 360 gamepad on the PC... that thing has certainly become fairly mainstream, but that's mostly because Microsoft released a library that allowed a programmer to read and write to it with a single line of code.
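For what it's worth, the "single line of code" claim is roughly accurate: the library in question is Microsoft's XInput, where one call reads the entire pad state and one call writes rumble. A minimal sketch (Windows-only; assumes the Windows SDK and linking against Xinput.lib):

```c
// Hedged sketch of polling an Xbox 360 pad via XInput on Windows.
// Requires the Windows SDK; link with Xinput.lib.
#include <windows.h>
#include <Xinput.h>
#include <stdio.h>

int main(void)
{
    XINPUT_STATE state = {0};

    // One call reads everything: buttons, triggers, both sticks.
    // Controller index 0 is the first connected pad.
    if (XInputGetState(0, &state) == ERROR_SUCCESS) {
        if (state.Gamepad.wButtons & XINPUT_GAMEPAD_A)
            printf("A is pressed\n");
    }

    // Writing to the pad (rumble motors) is likewise a single call.
    XINPUT_VIBRATION vib = { 30000, 30000 };
    XInputSetState(0, &vib);

    return 0;
}
```

That near-zero integration cost is plausibly a big part of why the 360 pad became the de facto PC gamepad even though it was optional hardware.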
I think a lot of attempts at peripherals in the console world were either limited by design (i.e. light guns), had awful technology (i.e. a lot of motion technology), or were solutions to problems that didn't exist.
One difference at least for VR is it's not strictly an input peripheral, as most are. It's a legitimate new way of experiencing a game world. It's probably the first technology in a long time that can make a case for actually increasing immersion in video games. For all of the advances in video game tech, we've been limited to 2D screens for a long time. But to now have the opportunity to feel like you're in a different world with things represented in the proper scope and scale; this has the possibility to be revolutionary.
Just look what dmckean wrote a little earlier:
"He put on this zombie game and the zombies are right in front of you and they're 6 foot tall, it's scary and insane."

Remember, games are made to make money. This means by default developers, as a whole at least, do the least amount of work to get maximum profits. That means they don't waste time (which equates to money) adding support for a controller or what have you unless virtually the entire consumer base has the accessory. Sure, some studios will take a chance on a game or two that uses the new device and even more will add in half-assed, unoptimized support, but that's about it.
True, but this is where modern support structures exist that haven't in the past. The plethora of indie developers and crowd-funded projects allows for risk-taking that most big publishers shy away from. So there is an opportunity for the foundation to be built before the bigger studios come on board. And don't forget that the current round of VR devices is being backed by some large companies (Valve, Sony, Facebook, HTC, Samsung) to begin with.