No, this is not a long overdue explanation for this blog. It's about an irritating article by Marsh Davies on Eurogamer. He thinks we, consumers, have no right to opinions.
He was apparently annoyed (like many assholes on the internet, I noticed) by the recent campaign to play HL2 and get Valve's attention, in the hope that they will maybe release Episode 3 in our fucking lifetime.
Look, there's an unspoken contract between players and developers. It's not just "lol they not buye your're gayem xD". The devs invest money into something for years, and for all they know, it may turn out to be a complete waste because no one pays for the game. The players invest money into a product, and it may turn out to be a complete waste because the thing is unplayable tripe. Each side is taking a leap of faith; the relationship calls for a lot of trust on both ends.
Now, when you are lucky enough as a developer to build a strong fanbase which likes the thing you do, there is one very important thing which you must absolutely not fuck up: Keep making those games. Those guys love you because they love the games you made. They'll buy whatever else you make, because they expect more like them. This doesn't mean you literally have to make clones of the same game- but do fucking learn lessons, and don't forget the things that made your games good. If you were praised for originality, don't start making uninspired shovelware! If you were praised for complexity and respect for the player's intelligence, don't start making idiotic, casualized games that betray deep contempt for the player.
What if you don't? Your fanbase might leave you. Your reputation will be hurt- many will buy your games expecting, rightfully, something of similar quality to what you've made before. They will, again rightfully, feel cheated. And when your audience likes one kind of game and you instead make another, there's just nothing to keep them interested anymore!
Sure, you may get a new fanbase, which likes the games you make "now". You better damn well hope that you do, because the one that liked you for the games you made "then", sure as hell won't be pleased.
With regard to Valve: They should be happy the players whine about Episode 3. How often is it that a business has consumers tell them, "just make us this product, and we'll happily throw lots of money at you"? No lengthy design process, no uncertainty, no agonizing market research. It's a million dollar idea, delivered right to your inbox, just waiting to be capitalized on! Valve is free to say, "thanks, but we don't feel like making a product guaranteed to turn huge profits and enjoy great success", of course. Their loss. But I can't fathom how anyone would fault fans for saying, "if you made a game like this, we'd totally buy it, just saying". For fuck's sake, they're doing Valve a favor!
The Eve Online reference? Entirely out of place. MMOs are a continuing service. If I'm paying you money for a service on a continuous basis, you damn well have an obligation to render me that service. The player complaints were perfectly justified- the company decided to start being obnoxious assholes and charge outrageous prices for cosmetic items, the players said, "hey, you better stop trying to be obnoxious, or you'll lose us, your customers". They listened, the customers didn't leave. Happy ending!
DDoS attacks? Sure, they're a bit of a stupid way to voice your enthusiasm and support for a developer's work. But that said, it wasn't a case of them "not being seen as working hard enough", as Marsh so grossly understates. First off, Notch was literally taking more than every other day off on vacations, and he'd then go and gloat about it on his blog. It's not that people thought he could work harder- he barely worked at all. Second, while you may question the wisdom of such payments, the purchases of early Minecraft were not as-is purchases. Perhaps legally they were, but Notch heavily implied, if not outright stated, and the customers all understood, that they were paying for ongoing future development of the game. And then Notch decided he didn't care about that ongoing future development. Buyers were not reimbursed. You honestly think Notch was in the right? Sure, they were gullible, and they got scammed. But saying Notch had every right to be lazy is blaming the victim no matter what you say, and you'll only fool clueless chumps who weren't there to see the thing unfold throughout 2010.
I guess linking to poorly-written forum threads passes for investigative journalism nowadays, but again, what's wrong? The game is for sale at a price. The price is too high. The customers are saying, "if you made it lower, we would buy it". What's the problem? Would you prefer to sit there shaking your head, staring at tea leaves, trying to divine why they're not buying your game? There, they've come out and said why. Go do what needs to be done. What the hell more do you want?
About the Kotaku trash, little needs to be said. But, dear Marsh, pretending people who are joking are serious, and then chiding them for it, is not clever or mature; it makes you look painfully, embarrassingly stupid and out of touch. And second, obviously exceptional events are not a good way of proving a trend, and attempting to use them as one has a similar effect.
But in general, he is just so, so wrong. Yes, developers do owe me something. They owe it to me to live up to their end of an unspoken contract. When I buy their game simply because I liked their previous game, not listening to critical word of mouth and negative reviews, they don't complain about it, do they? They don't bemoan the reputation that brought them all these customers, who decided to buy a game which, had it been made by a developer with a bad reputation, they might not have bought (or even heard about). In fact, developers don't complain at all! It's always these sycophantic commentators who feel the need to white knight for the supposedly hurt developers. Fuck you, white knights! I bought the game because I expected something good, like what the developer made before. If they're not delivering on that, I have every right to complain. If I go and tell them, "hey guys, I really like the game you made, please make more like it so I can buy that, too!", you have no right to tell me I'm being "entitled". Exercising your right to free speech to offer business opportunities to a company in need of them is not being entitled. You're a cunt if you say it is.
13 February 2012
28 January 2012
On reviewing indie games, and the "indie game bias"
As you may be aware, the issue of what exactly constitutes an "indie" game has come up in the gaming press before. My understanding is that it's not a trivial matter, and, well, nobody really knows.
You can say that it's indie if it's self-published, which is the traditional definition (e.g. for music). But unlike the music industry, there are huge game devs which self-publish, too. Valve, for one (and now with Origin, arguably EA as well). And then with online distribution, it gets really messy.
That definition also leaves out companies like Paradox, which act as publishers quite often, but seem closely associated with the "indie game" thing despite that.
You can think about the budget, or studio size, but there are plenty of small developers who make decidedly non-indie, very mainstream games.
You can try to define it in terms of ethos, but that's a downright titanic job. So is there a simple way of resolving this? I think so.
What is an indie game?
To me, for present purposes, none of this really matters. The only reason that we, as game consumers, even have a use for the indie/not-indie distinction is that there is a certain class of "indie" developers who try to innovate and be original, and at the opposite pole there are "mainstream" devs who just play it safe, take as little risk as possible, and produce more of whatever is popular at the time.
Then there are the things associated with these indie games: They tend to be quirky and weird, they don't fit nicely into the traditional genres of gaming, and they're difficult to describe quickly in terms of existing genres and conventions (unless you do something like "Game A meets Game B meets Game C! With dinosaurs and a leveling system!"). And this is why we, actual gamers, care: Because it's so common to just not want to play more of the same old, and because there's this pleasure of being exposed to an unfamiliar combination of narrative style, visuals, sound, gameplay mechanics and game structure- in other words, an unfamiliar game. Remember the first time you played Portal? Yeah, kinda like that. (Incidentally, by this logic you could also reasonably rate Half-Life as not really that indie, since all it did was take a formula and refine and improve it greatly, as opposed to breaking any molds. It was, after all, a textbook FPS.)
Big, mainstream developers with managerial departments and shareholders can almost never afford to gamble by making games like this (or at least they never try), while independent developers often do. But just because nobody will distribute discs of your game, and you sell it from your own store on your website, doesn't mean you can't just make a clone. It just so happens (and if you think about it, there are good reasons for it) that most don't, and indie devs are more likely to produce these "indie" games.
So, I think "originality" is a good definition of "indie-ness" as far as a game consumer is concerned. In simpler language, it's good enough for reviews! Except... Well, originality is hard to define, and harder to measure. It's not very practical. It's hard to give games a score on originality, and do it right.
Well, here I think I have come up with a clever idea: One other thing that seems to happen is that "indie" games are scarcely marketed, while soon after (and often before) release, a big mainstream game will be everywhere. When you have non-gamers asking about that game that's plastered all over the billboards, you know it's not an indie game you're dealing with. And if you don't advertise a cookie-cutter title, who will possibly play it? (By the way, marketing departments are arguably the worst thing to happen to games media- they're the ones who pay for the reviewer bribes.)
So, why not use this as a heuristic: A game is "indie" (in the sense of being original and innovative; at this point we have abandoned any relation to distribution methods at all) if its marketing budget is small (in relative or absolute terms, or both). Certainly, there is nothing stopping, oh, Notch, from buying ad space left and right. But he doesn't. Nobody who produces indie games seems to.
Perhaps it's because indie devs are small, and can't afford it. Or perhaps the uncertainty that comes with taking risks complicates return-on-investment calculations for the advertising budget. Maybe the guys who like making original games just aren't good at marketing. Who knows? But in the end, it doesn't matter: I can't think of a counterexample- an innovative game with an oversized marketing budget. And until the developers realize marketing exists, it seems like a decent enough criterion to use.
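For what it's worth, the rule of thumb can be written down as a toy function. This is only a sketch of my heuristic- the function name and every threshold in it are made up for illustration, not real industry figures:

```python
def is_indie(marketing_budget, dev_budget,
             abs_threshold=500_000, rel_threshold=0.10):
    """Toy version of the marketing-budget heuristic: a game counts as
    "indie" if its marketing spend is small both in absolute terms and
    relative to its development budget. Thresholds are arbitrary."""
    relative = marketing_budget / dev_budget if dev_budget else 0.0
    return marketing_budget < abs_threshold and relative < rel_threshold

# A hypothetical blockbuster: $40M of ads on a $60M game -> not indie
print(is_indie(40_000_000, 60_000_000))  # False
# A hypothetical small title: $10k of ads on a $200k game -> indie
print(is_indie(10_000, 200_000))  # True
```

And as I said above, it stays a rule of thumb: the function gives you a default verdict, which you're free to overrule when a game obviously doesn't innovate at all.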
Note that, for my own convenience, I do not consider even the most blatant viral/stealth marketing to be "advertisement", nor is it part of my "marketing budget" as I use the term here- even though I imagine in reality the viral marketers would be paid by the marketing department. My reason is that mainstream developers and publishers rarely seem to bother with large-scale (small-scale wouldn't matter) viral campaigns (if you don't count bribing or otherwise coercing reviewers), and when indie developers do, theirs are never large-budget ones, so this doesn't interfere with our classification according to marketing budget. The other reason is that it's really tough to decide where word of mouth ends and actual advertisement starts. What if you happen to know the dev, and you write a slightly more positive review because of that? What if you trust and like him based on personal experience, and then say his upcoming game will probably be good? Not an impossible distinction, probably, but also not a very productive one.
There's an obvious exception to this: No one in their right mind would call the post-EA-acquisition Bioware an indie developer or Dragon Age 2 an indie game. But there was that scandal with Metacritic... So here, I'll apologize and take yet another cop-out. You see, seeing as how blatant and obvious this was, I'll simply say that stealth marketing doesn't count as stealth unless it's kept well-concealed. You may say that kind of appeal to the consequences is nonsense, but suppose ElectronicVision Marketing decides to spend a trillion dollars on building a laser and burning an enormous ad on the moon for their Call of Honor: Modernfield 5. Suppose they then make a press release saying, "gee, thanks fans! Guess you thought our game was so good, you built a moon-writer laser just for saying so!". Would you really consider this stealth marketing, when nobody is being fooled?
Lastly, this is, after all, a heuristic. It's a rule of thumb without guarantee of absolute accuracy. It's a simple and easy-to-use one, which is why you would want to use it at all. But for instance, if a hypothetical game were to exist with a minimal marketing budget, which clearly does not innovate in even the smallest way, there's nothing stopping you from overruling the rule of thumb.
Who gets special treatment?
So now that we've properly identified our indie, and not-indie, games, we come to my original motivation for writing this: It's fairly common for game journalists, especially ones regarded as having more integrity (read: not known to shill for games with gigantic marketing budgets), to be biased when reviewing indie games, and to overlook flaws which they would not ignore in mainstream games. I won't dig for examples of this- it's an impression I am very confident about, and have had others similarly express confidence in. I will just hope you know what I'm talking about: the infamous indie game bias.
Now, assuming we agree so far, we can discuss the reasons for this.
Firstly, if one likes original games, as you and I and the supposed audience of this blog and the journalists who cater to people who care about integrity (read: intelligent adults) are wont to, one may simply let fondness for a game that supplied that much-sought originality get in the way of being objective. I mean, sometimes you just like a game so much that you stop noticing its most obvious flaws: just ask the Dwarf Fortress players (ask me).
Second, it may just be that different is confused with better. (Objectively non-superior) novelty by itself producing temporary positive reactions is a well-known psychological phenomenon. And really, psychology aside, we're all familiar with the expression "until the novelty wears off".
Probably you could come up with a number of other, similar reasons besides the above two. But I'd like to skip those, and go straight to the one I consider most crucial: There are some big players in today's industry, and historically there usually have been. They're businesses, and they try their best to keep the competition down. The indie games, with their non-existent marketing budget, are at a disadvantage against the latest big release. A reviewer who thinks that indie games do more good for the industry and the medium than big name releases (a common sentiment) would be tempted to "level the playing field" by giving indie games an easier time.
Now, here's why I call it crucial, and what this post (or essay, if you're feeling generous) is really about: Would that be so wrong?
Before I continue, I'd also like to mention that aside from the above, there's also another complication: It's easy not to be a corporate shill. I mean, you know when a game sucks and you are only giving it a good score due to a conflict of interest. You could hypothetically have a game reviewer who was taken in by the ads on TV and fooled into writing a great review for a crap game, but we're discussing here people who write for an audience of intelligent adults. Thus, the reviewers themselves are assumed to be intelligent adults. Seeing as how they set out with the explicit aim of writing an objective critique of the product, I think it's safe to also assume that they will be more or less immune to the effects of marketing.
On the other hand, especially with my first two stated reasons, you are giving a game a better score because of subconscious bias. By definition, you may be doing it without noticing, no matter how much you want to avoid it. There are scientific methods of dealing with this, but video game reviews are not science, and they are not even necessarily objective. So the best that can be done is to try really hard not to be biased.
The issue of those whom I quite callously call "shills" is doubtless a much bigger, more serious and damaging one in today's video game journalism. But on the other hand, the indie game bias is insidious and much harder to deal with, both for the reviewers themselves and for the audience trying to decide whether the reviewer is biased.
Leveling the playing field
To get back to the main topic: All the rest of the above aside, in the event that a reviewer is faced with the choice of whether to speak more highly of a game simply because it's an "indie" game (according to my above, bizarro, marketing-related definition), what should he do?
Well, I'm sure this is ultimately another complicated issue, and it's certainly not a trivial one to me. But some things are clear: You cannot simply praise every old indie game to high heaven, because you'll end up telling people that a game is great when in fact it's crap, with arguably the only redeeming feature being a novel gameplay mechanic -and nothing else- which it doesn't even do well. That's not nice. Besides, if games have or are to have artistic (or, more generally, intellectual) value, and be anything beyond simple escapism (which I and others believe they can be), then we as the audience (that is, both the reviewers and the game's target audience- really, if a game is to serve an intellectual function, the whole audience should be considered critics) have a duty and obligation to inform game creators of their shortcomings, so that they are able to improve. So I think there is no doubt that whether indie game bias is fundamentally bad or not, too much of it is bad for sure.
But then, if you just acted completely impartially (if that is even possible, and I already made the point that it probably isn't), the big names might crush many indie attempts, and the end result is innovation being stifled in the industry in favor of compensating for unoriginality with enormous marketing budgets. We definitely don't want more of that! That's already what we complain about!
So, here's my compromise: Be as biased as you like. Give the game crazy breaks for being indie. Overlook glaring flaws. But at the end of it all, acknowledge your own bias, and specify exactly how much of your praise is due to the game's merits, and how much is simply coming from the "indie game brownie points pool".
Of course you can't ever know perfectly well when you are being biased- but it seems if you try, you can still catch a lot of it. And if before you are about to misrepresent a game, you come right out and say "I am now going to misrepresent the game according to my bias", there's not much risk of anyone being misled or deceived... Unless they want to be.
You see, there is one last thing I'd like to bring up, and that is the, in retrospect obvious, observation that indie game bias is not exclusive to reviewers. It happens with consumers, too. If you just think about it for a moment, the underlying causes of reviewer indie game bias that I've talked about earlier apply perfectly well to consumers. I can easily recall times when, based on reviews, trailers, screenshots and whatnot, an indie game seemed to be crap and not something I would waste time and money on, but I decided to give it a shot simply because it was indie and I thought it deserved a break. Then there are all those games which you buy, and they are unplayable crap, but you're fine with it because you believe it has potential, and you want to support the developer and make sure they have a chance at realizing that potential!
Thus, disclosure of bias in a biased review also serves the function of reminding the reader that, while the game may not necessarily measure up to the standards mainstream games would be held to, he should remember that it is an indie game, and that if he has a habit of embracing his consumer indie bias, now is a great time to lend it an ear. It's also a handy way of maximizing the longevity of a review- because a game which is remarkable for being original today will no longer be original, and thus no longer as good (because we already consider originality a merit in and of itself), to someone reading the review years after release- they can just subtract the bias and use that score.
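To make the arithmetic concrete, here's a minimal sketch of what such a disclosed, decomposed score might look like. The function name, the numbers, and the split are all hypothetical- no real review system is being described:

```python
def disclosed_score(merit, indie_bonus):
    """A review score published together with its admitted bias:
    `merit` is what the game earns on its own terms; `indie_bonus`
    is the openly declared "indie brownie points"."""
    return {"published": merit + indie_bonus,
            "merit": merit,
            "indie_bonus": indie_bonus}

review = disclosed_score(merit=62, indie_bonus=13)
print(review["published"])  # 75: the score at release, novelty intact
# A reader years later, once the novelty has worn off, subtracts the bias:
print(review["published"] - review["indie_bonus"])  # 62
```

The whole point is that the decomposition is published alongside the score, so the subtraction is something any reader can do for themselves.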
I'd like to add, too, that this isn't simply about saying a game is better than it really is because you like it for some subjective, very personal reason. Indie game bias is relevant precisely because it's not personal: It's prevalent among both reviewers and consumers, and it has a rational basis (as I have hopefully shown above), being a moral imperative that follows from the dynamics of the market the video game industry depends on.
27 January 2012
Spoilers, and why you should love them
There's this idea that you should avoid spoilers in reviews. Or rather, if you do spoil something, legions of drooling imbeciles assault you in staggering waves, foaming incessantly at the mouth with rage at your vile transgression. One might almost get the impression that spoilers are a bad thing!
Now if you've read even a bit of this blog, you'll know that I don't care if there are spoilers. I tend to make the token rejection of spoiler etiquette and just get on with my spoiling. Well, the good news is, I've decided to actually explain myself! The bad news is that I never realized what an awful decision that was... But anyway, spoilers.
"Spoilers" usually means narrative spoilers. You can spoil other things. The chaps over at RPS think you can spoil mechanics. Personally, I find that silly. Car maintenance is a difficult, thankless job, with plenty of stress, dirt and physical labor. Calling these working-class men spoiled is quaintly bourgeois... Oh, what? Oh! Oh. I see... But seriously, it's a legit class of spoiler they've found, and a nice concept. But in practice spoil-able mechanics are very rare, and ones which matter if spoiled are rarer still.
So what was I saying? Right. Spoilers? That means narrative spoilers.
And with regard to narrative spoilers, there are two kinds of stories: The whodunnit, and everything else (let's call those "hedunnits"). What is a whodunnit? It's a logic puzzle in narrative format. It's a story written such that the primary enjoyment is derived from trying to guess the ending. The name comes from, well, those books where you have some guy kill some other guy, and some third guy tries to find out who's, well, done it.
Why is it special, the whodunnit? Because there is only one reason to read a whodunnit, and that reason is eliminated by a spoiler. I say read, but it can easily be a movie, or a video game, or any kind of narrative medium. It doesn't actually have to be a narrative, either. Any logic puzzle with a non-obvious solution is also, in a sense, a whodunnit, because the point is to figure out the mystery, and once you know the mystery, it's no longer interesting.
That's my point in a nutshell; it has two parts:
- Spoilers are bad only if it's a whodunnit
- Whodunnits are crap (for this reason) and you shouldn't read them anyway
Hedunnits
By virtue of my definitions, it's already okay to spoil non-whodunits, or "hedunnits", as it were. Because a hedunnit is not written with the assumption that the mystery is enough of a draw, it will have other positive qualities. Perhaps the narrative style is unique and revolutionary. Perhaps the characters are fascinating. Perhaps the analysis of the events and moral dilemmas that come up is insightful. Perhaps the story is simply told in such a way that it's exciting even if you have heard it before.
You know, all those positive qualities that we are used to looking for in real books (snap). The draw of any book is a combination of the mystery plot and the literary merits. You could theoretically have a book which is evenly split between the two: it would not be rendered worthless if spoiled, but it would lose significant power. In fact, I might say Tinker Tailor Soldier Spy was such a film. (I keep talking about books, but that is for convenience's sake; if you pay attention you will see this is applicable to works in any medium which tells a story.) However, in practice, this doesn't happen often- partly because there's no reason to try to do it. You either write a whodunnit, in which case it makes no sense to bother with serious literary pretensions when you could get the same bang for less buck by simply adding lots of mystery, or you write a serious book, in which case any mystery aspects are superfluous- you have to assume that your book will eventually become famous, the plot will become common knowledge, and yet it should still have the same value even then.
Thus people crafting stories will either make a whodunnit and focus on mystery, or ignore the mystery and focus on the other positive aspects, because no matter what your aim, it is always counterproductive to try to do both at the same time.
I mean, think of all the great works of literature that are regarded as classics. Do any of them really become less interesting after you learn the plot? If that were the case, they wouldn't be regarded as classics in the first place. (The season finale of HBO's Rome is one of my favorites. It's the one where -spoilers!- Brutus murders Caesar. If only those asshole historians hadn't spoiled it, huh?) And what about people who will hear about something, read the plot synopsis on Wikipedia, and then decide that it's interesting and they want to read the whole thing? Obviously some books (and films, and games...) are spoiler-immune.
So if you know for sure that you have something with artistic value, there's no question that spoilers are fine. To claim otherwise is to suggest that the thing has no artistic qualities at all, and is only interesting for its mystery value (and hence, quite paradoxically, not interesting to people who have already read it).
Moreover, as I will show next, there is no reason to read whodunnits in the first place. Not only are they artistically bankrupt, they are bankrupt, period. When you have a whodunnit, it is still okay (and better!) to spoil it, because once you spoil, the uh, spoilee will no longer be compelled by their curiosity to waste time reading a book which becomes worthless the moment they finish reading it. You're doing them a service. Regardless of the situation, you should always spoil everything.
Whodunnits
That's all well and good, but how is it that a whodunnit is objectively bad? People enjoy reading them sometimes, don't they? Then, by spoiling it, you are taking away their enjoyment, aren't you?
Well, true. However, we need to consider what a whodunnit is. A whodunnit is not great art, as established. In fact, it's not even a matter of degree; whodunnits are fundamentally different from other stories. Hedunnits tell a story as a form of artistic expression. A whodunnit is not art at all, and has no such pretensions; it is essentially a logic puzzle which just happens to exist in narrative form.
Now, compared with puzzles (and math problems) in general, I think whodunnits don't measure up in that way, either. They are the simplest, most worthless kind of logic puzzle. My reasoning is similar to the hedunnit case: A really good puzzle is good because its solution is not a gimmick- it's a genuinely intellectually enriching thing, and even after you know the answer, the solution (or how that answer was obtained) is still very interesting by itself, and thus the puzzle loses almost no value even after being spoiled. The whodunnit as a puzzle is the worst kind of puzzle- there is just one simple trick to it, if you've seen it solved you know how to solve it, and the solution is so simple that you gain nothing from knowing it. It's a waste of time.
If you complain about spoilers on the basis of liking whodunnits- which, as I explained earlier, you can only like as logic puzzles anyway- then you are still better off not bothering with them at all: if you like logic puzzles, there are far better logic puzzles out there, and those stay interesting even after you spoil them.
Conclusion
So, in the end, for people who have no interest in whodunnits, a policy of casually spoiling things is perfectly fine. It's better than not spoiling, because if they realize that their interest wanes as they find out about the plot, they can safely deduce that it's a whodunnit you're talking about and it's not worth their time.
On the other hand, the people who like whodunnits can only possibly like them for the logic puzzle aspects, and thus if they see their interest being diminished by spoilers, they can also safely say that it is a shoddy puzzle you're speaking of, and know that they can easily find much more worthwhile ones.
Once again, I've spoken mostly in the context of books, but obviously you can have a whodunnit film (just take a whodunnit novel and make a movie out of it... Not that a movie of a whodunnit book is precluded from having artistic value- it's perfectly possible, otherwise no one would see a Sherlock Holmes movie). You can have a whodunnit video game. In fact, the whodunnit is most defensible in books, where the story is more central. It is far easier (as if it wasn't already easy) to think of a game that remains enjoyable after you know what happens. Hell, there's a whole class of games without any plot to begin with!
As an aside, since I mentioned mechanical spoilers in the beginning: I think Kieron's example is that in Amnesia, you don't die from being insane, although it sort of seems like you would. This is nonsense. For one, it makes no sense for you not to save the game as soon as you encounter the insanity mechanic and see exactly how much of it you can take before dying. As a player consciously attempting to beat the game, you are being stupid (i.e. strategically inefficient) by not doing so. There's nothing to spoil. And incidentally, for me anyway, the mechanic was still kind of creepy after figuring it out- it distracted me and enhanced the feeling of danger, because I couldn't properly see what was going on.
I mean, it makes perfect sense if you believe in not spoiling things, to avoid spoiling mechanics. It seems like a slippery slope, really, where you can't talk about anything for fear of spoiling, but whatever. The point is, I don't believe in spoilers, and to me the "mechanical spoiler" is equally irrelevant.
PS: Actually, it seems spoiling a thing, even a whodunnit, makes it even more enjoyable, according to science.
20 January 2012
Reviews: To score or not to score?
For some time, I had been of the opinion that reviews should not have scores. After reading Alex Kierkegaard's writeup on the topic, I changed my mind.
Background
Alex's post is long. It's well written, and you should read it sometime, but I think I'll still be an enormous hypocrite and provide you with a summary of what he says anyway:
- It should be possible to read a review, and then answer the question, "Did this guy like the game? Was it a waste of his time? Does he regret playing it? Would he recommend it to others?" If you cannot answer this, the review is worthless, rambling drivel which literally does not make any sense.
- There is no such thing as a score-less review. To demonstrate, take any given group of reviews (ostensibly) without scores. Now label each review as "positive" or "negative". You should be able to do this easily thanks to point 1. When done, go back and replace each "positive" with 1, and each "negative" with 0. Even though the reviewer did not give a score, you have correctly approximated the score that he would have given, on a scale of 0 to 1. Now, repeat this procedure with 5 tags: strongly positive/negative, mildly positive/negative, and neutral. Replace them with 1-5. You have now approximated a score on a more familiar 1-5 scale.
- It follows that a review, any review, even if it claims to not assign a score, must describe a score implicitly. That is, even if there isn't a score, you can read it and say, oh, this looks like a 6/10 (from point 2). If you can't say this, then the review is nonsense (from point 1).
Alex also talks about perfect scores. He's wrong there: If you think 100/100 is a perfect score, I don't see why you can't or won't think 5/5 is a perfect score. It doesn't really matter to me- I don't have a problem with scores being out of 5 and not 100.
Another thing he complains about is close scores like, for example, 76/100 and 77/100. His position boils down to "I cannot imagine myself making very precise judgements about games, therefore it is impossible." It is a laughable position. Alex seems to have a habit of confusing the negligible with the actually non-existent: Just because it's hard to see that 76 to 77 difference, doesn't mean it doesn't exist. I agree that if you are really using a score system with 100 values (or worse, decimals!), you should probably think again. But that doesn't mean it's inherently bad, and it doesn't mean there can't be some guy out there who really can review games so finely that he can discern a 1/100 difference in quality (although, most that score out of 100 probably can't do it). This part is also tangential to my purposes.
Motivation
So what are my purposes, then? Well, as I said, it's clear that I can't just not score my reviews. That would be sticking my head in the sand. However, I have one problem with scores: Suppose you have a scoring system out of 100. You have 4 categories: Graphics, Gameplay, Story, Replay value. Each one gets a score out of 25, then you sum them all for the final score. Reasonable enough, and many mainstream reviewers actually do this. (To make it Alex-friendly, you can make each category 0-1 and then sum them to 0-4.)
Anyhow, the problem: With this scheme, Dwarf Fortress gets 0+25+0+25=50. But Dwarf Fortress isn't a mediocre game! To fix it, you can make gameplay and replay value out of 45 each, and the others out of 5. Then DF gets 90. Cool, right? Yes, but now Limbo gets, oh, 5+30+5+0=40 if you are really generous. I mean, I didn't think Limbo was perfect[1]. But I certainly didn't think it was below-average crap that deserves a 40/100.
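To make the arithmetic concrete, here's a quick sketch of the two weighting schemes. The per-category scores are hypothetical numbers of mine for illustration, not anything from a real review:

```python
# Hypothetical per-category scores, each 0.0-1.0:
# (graphics, gameplay, story, replay value)
games = {
    "Dwarf Fortress": (0.0, 1.0, 0.0, 1.0),
    "Limbo": (1.0, 2 / 3, 1.0, 0.0),
}

def total(scores, weights):
    """Weighted sum of category scores, scaled to 0-100."""
    return 100 * sum(s * w for s, w in zip(scores, weights)) / sum(weights)

equal = (25, 25, 25, 25)   # every category worth 25 points
skewed = (5, 45, 5, 45)    # gameplay/replay worth 45, the rest 5

for name, scores in games.items():
    print(f"{name}: equal={total(scores, equal):.0f}, skewed={total(scores, skewed):.0f}")
```

Equal weights land DF at a "mediocre" 50, and the skewed weights rescue it to 90- but the very same skew drops Limbo to 40, which is the whole problem.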
So, for some games, graphics matter and replay value doesn't. For others, the opposite. Rather than come up with a complicated weighting scheme to solve this, I tried to find a lazy shortcut. I think I succeeded.
Solution
If you give a game a 10/10, what does that mean? Essentially, it's the same as saying, "dude, this game is awesome, you'll love it". 0/10 would be saying "piece of shit, don't bother". Reviews are, at their basest, for answering the question, "should I play this game?" Yes, they serve as commentary and can be very valuable in that respect as well, but that question is what gave rise to "reviews" in the first place.
So how would I deal with, say, DF, if I were to give scores? Probably I'd give it a 9/10, and say something to the effect of "if you like roguelikes with ASCII graphics, then it's really a 10/10, and if you really care about the graphics it's 6/10 with tilesets and 3/10 without". Tastes vary. Review audiences are heterogeneous[2].
However, the review isn't necessarily going to be an absolute endorsement (or disapproval), either. It will probably say, "some such people will like this, some such people will not". Now, if you see a 5/10 game, what if you can't tell whether you're the guy who will like it despite its flaws, or the guy who will definitely hate it?
Sometimes, it's obvious from reading the review. Oftentimes it's not. And in that case, you'll guess. And with a 5/10 score, you will probably guess that you're equally likely to be in either camp... Wait, hold on. Isn't 1/2 the chance of success for an unbiased binary trial? Hmm, what if... What if review scores are probabilities? What if, when I give a game score X out of Y, that means I'm estimating X/Y of my audience will like it, and consequently[3], that there's an X/Y probability that you will like it?
Results
Yeah, I'm kinda proud of myself for this. I think it's a great idea - I'm perfectly happy with a score system like this, both as reviewer and review reader[4]. So how would it look in practice?
Now, I don't want to make 1% resolution estimates, there aren't even 100 people reading my reviews. So I will use this scale:
- 1: a game only an indie dev could love - 10% chance you'll like it; 10% of my audience will like it.
- 2: mostly shit, but has noteworthy positive qualities - 30% chance you'll like it; 30% of my audience will like it.
- 3: absolutely mediocre - 50% chance you'll like it; 50% of my audience will like it.
- 4: recommended, but not for everyone - 70% chance you'll like it; 70% of my audience will like it.
- 5: if you don't like this, you don't have a soul - 90% chance you'll like it; 90% of my audience will like it.
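Since the scale rises linearly in 20% steps, the score-to-probability mapping is a one-liner. This is just a sketch of the convention above, nothing more:

```python
def like_probability(score):
    """Map a 1-5 score to the estimated chance that a reader
    (equivalently, the fraction of my audience) will like the game:
    1 -> 10%, 2 -> 30%, 3 -> 50%, 4 -> 70%, 5 -> 90%."""
    if not 1 <= score <= 5:
        raise ValueError("scores run from 1 to 5")
    return (2 * score - 1) / 10

print([like_probability(s) for s in range(1, 6)])
# prints [0.1, 0.3, 0.5, 0.7, 0.9]
```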
In fact, if I happen to decide that "indie game bias" is relevant for a game, I can just bump it up one level. That seems reasonable. If the devs are, say, literally curing cancer and disease, I can totally see bumping a game 2 levels. I like that - I'm okay with foldit being a 3/5 game, and I'm okay with treating it like a 5/5 game because of its mission.
Furthermore, the above may be written in the context of video games, but there's nothing about this system specific to video games. There's no reason not to use it for movies, books, what have you.
Lastly, the nice thing is that, while I've never heard of a reviewer using this system explicitly, all the review scores out there are very compatible with it. Good games are likely to get high scores, and you are likely to enjoy good games. Ergo, high score means more likely to enjoy. You can assume these are just traditional scores, too, if the "math" is confusing, but if basic probability confuses you, what on earth are you doing on my blog?
Footnotes:
1: If you look now, you will see my Limbo review does include a score. That was added after the fact, after this post was written.
2: I don't know if you can even target a homogenous audience of non-trivial size, but I know I wouldn't want to even if I could.
3: It's just basic probability. If a people in a room like a game, and b people don't, then when you pick one of them at random, the chance that you get someone who does like it is p=a/(a+b). Since you are only thinking about this because you have no idea which group you belong to, we can assume you are equally likely to be any one of those people. So the chances of you liking the game are also p, which is equal to the fraction of people who like it.
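The footnote's claim is easy to sanity-check empirically: simulate picking a random person out of a likers and b dislikers. A throwaway simulation of mine, not anything from the post:

```python
import random

def chance_of_liking(a, b, trials=100_000, seed=0):
    """Pick a random person from a room of `a` likers and `b` dislikers,
    many times; the fraction of picks who like the game approaches a/(a+b)."""
    rng = random.Random(seed)  # seeded so the result is reproducible
    room = [True] * a + [False] * b
    hits = sum(rng.choice(room) for _ in range(trials))
    return hits / trials

# With 70 likers and 30 dislikers the estimate hovers around
# p = a / (a + b) = 0.7.
print(chance_of_liking(70, 30))
```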
4: It also solves all sorts of problems we weren't even trying to solve: Among other things, it means that even if you buy a 9/10 game and hate it (or buy a 1/10 game and love it), that's fine, because it's a probabilistic prediction, and you are still better off trusting it (assuming the reviewer is trustworthy and reliable).
5: Two things you may notice: First, I'll never have to say you will definitely like a game, or definitely dislike it. Second, no matter how many times I'm wrong, I can always blame it on probability. Man, I'm so clever! Seriously, though: Sorry about this, but them's the breaks. I don't think a system that allows 0% or 100% probabilities would be productive, and I'm not sure if it would be mathematically sensible. Nor do I intend to find out.
04 December 2011
Review: Skyrim (PC) - Why console ports suck
Let me say again, to be clear: I don't like console games, and Skyrim is a console game. I don't like a lot about Skyrim, and most of my gripes have to do with it being a console game (although a few big ones don't), so a lot of this post will be me complaining about console ports.
By the way, it's not like Skyrim is awful. I don't hate everything about it (obviously, any game that shares lore with Morrowind already gets points from the start, as does any RPG with lots of skills that are improved by using them). But there really isn't anything non-trivial so far that I really like, so I'll defer on talking about the things I like until I'm done with the game (I'm not sure yet if I'll have the willpower and/or time to even finish the main quest). So, this post will mostly be me complaining.
Console vs. PC
As I said, I don't like console games. It's not a very fundamental dislike. If you think about it, what is the basic difference between a PC and console? You could get from the former to the latter in two steps: First, make the hardware monolithic, non-customizable, proprietary, and so on (incidentally, this essentially produces a Mac). Then, gut the OS to make it capable of only playing games. You have pretty much made a console, except because we started with a PC, it's still displaying things on a monitor and you still use a keyboard and mouse for input.
That's not a big deal, because input/output method is independent of the platform: You could hook up a mouse and keyboard to a console, and you could use a TV and gamepad (with no mouse or keyboard) to -albeit clumsily- control your garden variety Windows PC. Of course, in practice nobody seems to do these things, and this is the crux of my issue with console games, but before I get to that: There is nothing really wrong so far. If anything, monolithic hardware and a minimal OS dedicated to games are good, because they make coding games easier, although there are issues of compatibility... But none of this really influences game design. And yet it's obvious that console games are different from PC games in certain key ways.
That difference in game design stems from the difference in traditional input/output setups. As I said, PCs, or rather PC users, tend to use monitors and keyboards while console users go with gamepads and TVs. It doesn't have to be this way, as far as I know it's not terribly difficult to use a keyboard with a console and vice versa. However, for some reason, nobody does.
With the PC, of course, it's stupid to use a gamepad instead of a keyboard unless you are playing a game, and even then, it makes sense only for some games. Some people already do use their TV as a monitor (I have) and a lot simply don't have a TV and use their monitor to watch TV stuff.
With consoles, there really is no reason to not use a monitor and keyboard/mouse. There is no reason for this not to be standard- except most console games are made for playing with a gamepad, but that's really only because it's the standard. I guess historically, consoles came about because back in the day, you couldn't expect people to buy computers for playing games on them. Now, the price difference is no longer there.
Note that for the most part, a console's I/O scheme is a subset of a PC's. If you try, you *could* devise a hypothetical game which uses the buttons and analog sticks of a gamepad in such a way that you simply can't play it with a keyboard and mouse- but such games are almost non-existent. For instance, console games like to assign movement to the stick, so you can control walk/run speed with how much you push it, but in most games you never want to and almost never need to run at anything but the maximum speed. It's not that you can't make a game where sometimes you want to walk slowly- it's just that seemingly nobody does. The one exception is detailed flight and racing sims: things like fine steering adjustments often do matter, and they are impossible on a keyboard and clumsy with a mouse, while an analog stick works quite well- except people who play those seem to get specialized controllers anyway, even on consoles.
Notably, you just can't make an FPS game on a console- anything that involves pointing at things on the screen a lot is torture without a mouse. Yes, people try anyway, and I think the attempts suck and will be doomed to suck until the console market gets used to the idea of using mice and keyboards. Yes, you "could" make a PC-style shooter on a console, meant to be played with a mouse, but it won't sell well because most console users won't want to buy or use a mouse just for your game.
Notice that it's harder to make a console RTS: Not only do you need to convince users to buy and use a mouse, but you also need them to sit close enough to the TV to see the small units and interface elements. The mouse, by the way, isn't just for selecting units- any game with many complicated menus will be difficult to handle without a mouse and keyboard.
To solidify the point, observe that in terms of design, it's trivial to port a console game: You do have to deal with different APIs and such under the hood, but you have to change almost nothing about how the game is played- just remap the controls to a keyboard and bind the camera to the mouse. Whereas porting a PC game to the console, especially a "real" PC game which heavily uses the mouse and keyboard to their full potential, would be a ridiculously difficult task. Moreover, there is no genre of console games which does not exist on the PC- some, like platformers, aren't as numerous, but you certainly can't say no notable PC platformer exists in the same way that you can say no notable console strategy game exists.
Complexity ceiling
Hopefully, I've made clear the following: There are kinds of games which can realistically (as in, more than 3 people will actually buy and play them) be made on a PC, and kinds of games which can realistically be made on a console. Practically all of the latter can be ported to the PC, although not all developers choose to do so, while there is a distinct class of PC games that cannot be ported to consoles. Note that I'm saying kinds- most console action RPGs aren't ported, for instance, but that doesn't mean they can't be, as those few that are ported clearly demonstrate.
As I said earlier, I don't like console games. That's not because they're bad games- sure, many suck, but just as many PC games suck. It just so happens that even "good" console games are rarely interesting to me. Not every game I like is complex, but most are. And complex games, along with FPSs, are exactly the kinds which go into the "wouldn't work on a console" category.
Furthermore, with an RPG like a TES game, there are really two aspects of appeal: the lore and the complicated mechanics. And when you have the added constraint of making a game that must work on consoles too, inevitably some things go. You just can't have a million attributes- there would be too much information to show. Displaying copious information is something a PC setup can deal with, but with a console you need the text to be large and legible from a distance, so there is a very finite amount of text, and therefore information, you can display. The same goes for things like interface icons and elements in general- they must be large and visually distinct to avoid confusion.
Or rather, now that I think of it, I don't see why you can't have lots of small text in console games. I've never found it particularly impractical to view lots of text/symbols on a TV, from a distance. But it seems that developers at least believe that console game interfaces should be as simple as possible.
The second factor comes in when you consider that a game which has a large amount of information to display needs to provide effective ways to navigate and manipulate that information. Again, a gamepad is not very well suited to this task, and keyboards and mice have, for decades, proven to be ideally suited to it.
The Interface
To illustrate, I'd like to contrast the interfaces of Morrowind and Skyrim. Morrowind, of course, had a single "status" screen which showed the inventory, map, stats and spells; in Skyrim, these have been split up into four screens you get when you press Tab. For my purposes, I'll focus on Skyrim's inventory screen.
[Image: Morrowind status screen.]
There's really a lot I like about Morrowind's UI, such as being able to move around the parts and what not. But the crucial part is this: It's just far more efficient. First, this whole thing pops up with one keystroke. Suppose you need to answer a basic question, such as "do I have a pickaxe?". One key, done. Maybe click and drag the scrollbar or click an inventory filter if you have a lot of stuff on you. In Skyrim, you need to press Tab, lose half a second to the cute fading animations, press up to select inventory, another animation, press left to get to the leftmost column, then press up or down a few times to select weapons, and then keep pressing up or down until you get to "P":
[Image: Skyrim inventory.]
Now I realize this seems silly, but those few milliseconds of animation, and those few extra keystrokes really do matter: This is something you do thousands of times over the course of the game. It just cannot be tedious.
What's worse is, when the interface is annoying to use, as a player you try to avoid using it as much as possible. That means having only one weapon and/or spell, and switching as little as possible- so the least frustrating way to play the game also happens to be the most boringly simple cookie-cutter way, which I find tragic in a game whose potential for complexity is precisely what makes it noteworthy. If I wanted to play a game where all I did was swing an axe at enemies from start to finish, I wouldn't play a TES game- I'd play a hack-and-slash action RPG.
Ideally, for instance, I would play a versatile magic-using character. Whenever I am attacked, I would cast an armor buff and the appropriate elemental resistance buff for the enemy. Then I would soften up the enemy with an appropriate damage spell (fire for ice wraiths, etc.), maybe conjure a suitable creature, then cast an attack buff and finish the enemy with my weapon. Maybe I'd even cast a bound weapon spell! Then, if I ran out of health, I would heal myself, or if it looked like I was being overwhelmed I'd cast invisibility or something similar and escape.
That would actually be fun! I would have a wide variety of choices (spells, potions and weapons) at my disposal, each suited to different tasks, and I would have to decide on the fly which ones are applicable to that situation. It would be a game that actually gives me something to think about. It was what Morrowind did, and that is why Morrowind was fun. Skyrim, annoyingly, gets the hard part right: It supplies you with the options (unlike many other RPGs which have tons of spells, weapons and what not, most of which are utterly useless). It supplies you with the variety of problems that have different solutions (even though most are ultimately "kill the enemies", the enemies must be killed in very different ways, which is again something most games like this forget about).
The problem is, actually doing the above in Skyrim is torture. Watching the animations alone would easily waste a minute per encounter, not to mention breaking the flow. The amount of scrolling through menus would be obscene. It's just not something you can really do- and on either point, I know, because I tried it! A few times, just for the heck of it, I did try the many-spells approach. The interface aside, it was very enjoyable, and very effective. Unfortunately, it was just such a hassle that I cannot do it on a regular basis, once every few minutes, in a game that is dozens of hours long. In the end, I just use the same fire spell on everyone unless I absolutely can't win without other spells, and even then there's the temptation of simply turning god mode on briefly just to avoid having to deal with the obnoxious interface to implement a tactic I know will work.
And yes, I know there is an inventory/magic key in Skyrim. I am already using it. Guess what, you still have to watch the animation! It's still not instantaneous like Morrowind's was! And, no, the favorites menu doesn't work either: First off, it shows a ridiculously small number of items, and you still have to scroll.
Now, just to convince you that this is a systemic issue, and not just the Skyrim devs being incompetent: Let's consider why the Morrowind interface is better. First off, it can display more spells, because they are in a smaller font. Scrolling is easier because you have a scroll bar. You can have several screens at once, because when navigating you don't have to go from "select weapon mode" to "select screen mode" and then select the spell screen, and then select the spell. You just click your mouse on whichever element of whichever screen you want. There is a reason why the mouse became so popular, guys.
By the way, I just wanted to mention the stats. Look at how Morrowind displays them: The window is actually quite small (when I play Morrowind I usually close the useless map and make the other windows larger, not to mention that Morrowind supports resolutions higher than the 1024x768 pictured, with the same font and icon size). But anyway, you can still see all of the attributes, all three status bars with numbers (when drinking potions in Skyrim, you are told how many points they will heal, and are shown the appropriate status bar, but you don't see its exact numbers), all major and minor skills, and even some other skills. You can't see all skills, because Morrowind had a fuckload of skills, but this UI still shows you much more information than Skyrim's UI, and in a quarter of the screen space.
What's more, look at the inventories: One thing Morrowind can afford to do is show cryptic icons for everything which don't necessarily display all the information about the item. This works, because you can mouse over the icon to see what it means. If it's a common icon, over time you effortlessly come to memorize the meaning, and don't need to do that. Note another thing about Morrowind's UI that you cannot see: You could assign any items or spells you want to your number keys, which really helped things- you didn't even have to go into this screen. The combination of the mouse and no constraint on font or icon size is what makes the interface so much more useful- which is exactly what consoles cannot have, so long as the current prevalent idea about interfaces of console games persists.
The Skyrim interface, if you took out the stupid things like fade-ins, probably is as efficient as you can get, when you are constrained with a gamepad. That said, with a mouse and keyboard, as well as a close-by monitor, you can do a much, much better job. The interface isn't my only problem with Skyrim, as I said, but if the PC port's interface was designed specifically for the PC (what you can't see in the screenshots is the embarrassingly buggy mouse support, by the way), I would have probably loved this game despite the other shortcomings.
That's not to say Skyrim's interface design does not reveal some stupendously bad decisions: I won't list all of them, but most obviously, why is the favorites menu only in a tiny corner of the screen? Bringing it up pauses the game, you usually have many more things favorited than the 3 or 4 items it can display at a time, and laying things out in a 2D table would make navigating much easier, even with a gamepad.
16 May 2011
Germany's national character and unique traits in epic strategy
So I have been reading the article on Flash of Steel on Germany’s national character. I have to say it's a very interesting topic to look at and Troy Goodfellow does it plenty of justice.
I don't know what he is planning for it, but to me each of his posts just goes to show that the "national character" idea is thoroughly silly.
To be clear, I'll be speaking of computer strategy games such as Civilization which cover a long time period. How long? Age of Empires 3 is probably the shortest.
First off, the concept of a nation has changed a lot throughout history, and people did not always act as the nation-states we seem to be assuming they did. The Germany of World War 2 is obviously not just "Teutons with more tech". So when we have a "Germany" in games such as Civ, where this Germany remains Germany from the ancient eras into the future, we are already suggesting a very bizarre world which functions very differently from ours. So there's already a problem with translating a historical "Germany" which hasn't even lasted 150 years (if what you mean by Germany is the state that Bismarck created, which later went on to enter the two world wars). When you start writing up a "Germany" civ for your game, do you draw ideas from Nazi Germany? West Germany? Today's Germany? Prussia? The Holy Roman Empire? The Teutonic tribes? As I said, they are not one and the same, and they don't share "traits". Or do you mash them all together into a big ball of nonsense?
Second, there's the issue of the traits themselves. In the last 50 or even 100 years, one thing Germany has had a very well-known reputation for is excellence in engineering and manufacturing: not so much the ability to churn out a lot, but producing high-quality, reliable, well-designed machines. Think of Mercedes-Benz automobiles, supposedly built like tanks. Somewhat relatedly, another thing people think of in regard to the Germans of today is discipline. (To go off on a tangent, Germany has had a huge population of Turkish migrants since the 60s/70s, who have not always been crazy about integrating, and I understand they have been a subject of much controversy there over the years, and still are. This has gone on for longer than WW2 and is certainly a huge contributor to what Germany is today, but you don't see that in any strategy games.)
Now, I'm sure nobody has any funny ideas about Germans having some genetic predisposition to being good at making reliable cars, or to being disciplined. Again, it's not like the Teutons (or those before them) were particularly disciplined, it's not like the Germans are really Teutons, and it's not like the Germans (of the last 150 years) have been around as a group for long enough to develop a meaningfully distinct gene pool. So it's a cultural thing.
But a culture of discipline, or technical excellence, or what have you does not just pop out of nowhere. It develops gradually over time, as a result of the environment in which a group of people exist, as well as other cultures they are in contact with, their history, and most likely also events of random chance. If the German people are disciplined today, it is because of their history.
But games like Civ are all about taking a blank slate and rewriting history. If you picked Germany, ended up alone on an island, focused only on culture, and never entered a conflict, let alone lost a world war and signed a treaty as overwhelming as Versailles, why SHOULD your Germany have Panzers and disciplined troops? The circumstances which created those are simply not there! It should have crappy tanks, and crappy troops, because your people have never cared about war.
One could say, "But it would be boring if Civ had only one civilization". And that's true. But once you notice the problems I've talked about, the whole thing just gets more confusing the more you think about it. So why not have the game model socio-cultural evolution? Why not start everyone without unique traits, using the civ only to select your city names (you gotta have SOME character, right?), and then grant unique traits to players over time based on how they have played?
Suppose you fought a big war (the game could look at how many resources' worth of units were killed on both sides to tell a world war from a regional skirmish, for example, or the length of the conflict, or whether the top 5 players are involved in it) and surrendered, having to give the victor a great deal of free stuff to convince them. Perhaps the game would look at whether you gave up any cities, or whether the gold you must pay per turn is above a flat amount or above a percentage of your GDP. If you pass the check for getting your ass kicked hard enough, you get a pop-up: "National Socialist Revolution: Your armies are now more powerful, you get a bonus to production, and you can produce the following unique unit, which is a stronger version of the unit whose prerequisite tech you have most recently discovered." Perhaps there would be drawbacks too. Perhaps suffering a big defeat again could lead to a "Leader deposed" message which revokes your traits. Perhaps when a trait is revoked, you get another trait which pushes you in the opposite direction. (To reflect the fact that Germany essentially lost two world wars yet reacted very differently to the two, and to give the player some extra agency, the dialog could let you choose whether you accept the trait.)
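As a rough sketch of how such a defeat check might look in practice (every name and threshold below is my own invented example, not anything from an actual Civ game):

```python
# Hypothetical "crushing defeat" check that could trigger a trait like
# the one described above. Thresholds and names are illustrative
# assumptions, not from any real game.

CRUSHING_FLAT_GOLD = 500     # tribute per turn above this always counts
CRUSHING_GDP_RATIO = 0.15    # ...as does tribute above 15% of GDP

def is_crushing_defeat(cities_ceded, gold_per_turn, gdp):
    """Were the surrender terms harsh enough to radicalize the populace?"""
    if cities_ceded > 0:
        return True
    return (gold_per_turn >= CRUSHING_FLAT_GOLD
            or gold_per_turn >= gdp * CRUSHING_GDP_RATIO)
```

Ceding any city counts immediately; tribute is measured both in absolute terms and relative to the economy, mirroring the "above a flat amount or above a percentage of your GDP" idea.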
Some traits could only be attainable in certain eras. Some traits could be negated by a tech, even if it's discovered by other players: "The discovery of TECH by PLAYERCOUNTRY has spread to and disillusioned your people, and you no longer receive the bonus from TRAIT." Or perhaps, so long as you refuse to trade for or research the technology, your people remain sufficiently oblivious to keep giving you the bonus. Perhaps they don't like you using this strategy; or perhaps, if your empire has a history of being on the bleeding edge of science, putting off a certain tech makes them very unhappy, while if you have always lagged behind, your people won't care about the crazy customs of the foreigners.
You could have traits that work like skills in Morrowind-style RPGs: with every wonder you build, you get a bonus to building wonders. Once you don't build any for a while, the bonus decays, as the culture of erecting monuments becomes a thing of the past for your people. Perhaps certain drastic events could grant or revoke traits of their own: large wars, significant defeats or victories, global climate events (with non-static traits it suddenly makes a lot of sense to have random global events), plagues, political/social/artistic movements (triggered by research?)…
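A minimal sketch of such a decaying, skill-like trait bonus; the gain, cap and decay rate are made-up tuning values:

```python
# Morrowind-style "use it or lose it" trait: each wonder built raises a
# bonus, and the bonus decays on every turn in which nothing was built.
# All constants are invented for illustration.

GAIN_PER_WONDER = 5.0        # percentage points gained per wonder
DECAY_PER_IDLE_TURN = 0.02   # 2% multiplicative decay per idle turn
MAX_BONUS = 50.0

class WonderCulture:
    def __init__(self):
        self.bonus = 0.0     # percent discount on wonder construction

    def on_wonder_built(self):
        self.bonus = min(MAX_BONUS, self.bonus + GAIN_PER_WONDER)

    def on_turn_end(self, built_this_turn):
        if not built_this_turn:
            self.bonus *= 1.0 - DECAY_PER_IDLE_TURN
```

Multiplicative decay means a long-standing monument culture fades gradually rather than vanishing on the first idle turn.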
Perhaps the game notices that you haven't been acquiring new cities for centuries, but recently discovered a new continent and have rapidly expanded there. It doesn't need to know about the "discovery of new land"; just looking for a spike in your cities-founded-over-time graph is enough. In that case you get a prompt: sacrifice a lot of economic gain from the new cities (a penalty to gold production?) or risk revolt. If you do risk revolt, you had better have the military strength to control the new lands on call, or you might end up with an American Revolution, like Britain once did.
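The spike detection itself could be as dumb as comparing the latest era's city foundings against a recent average. A sketch, where the window size and spike factor are arbitrary assumptions of mine:

```python
# Detect a colonial-expansion spike purely from the cities-founded
# history, without the engine knowing anything about "new continents".
# Window size and spike factor are arbitrary tuning assumptions.

def expansion_spike(cities_founded_per_era, window=3, factor=3.0):
    """True if the latest era founded far more cities than the
    average of the `window` eras before it."""
    if len(cities_founded_per_era) < window + 1:
        return False
    baseline = sum(cities_founded_per_era[-(window + 1):-1]) / window
    return cities_founded_per_era[-1] > max(1.0, baseline) * factor
```

The `max(1.0, ...)` floor keeps a single lucky settler from counting as a colonial boom when the baseline is near zero.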
And on that note, why is it that Civ-like games start with a fixed number of "nations", and at most the number decreases as time goes on? You could say that two thousand years ago, Europe "started" with one nation, Rome. And today, we have… certainly not fewer. Why not occasionally throw up a message: "The cities of X, Y and Z are dissatisfied with your rule and are seceding. They call themselves PLAYER!" Suddenly, the named cities change to a new color, and henceforth are controlled by a new AI player. Much like Civ5's city-states, you could make such rebel players not compete for global victory, to make things even more interesting. The very act of fighting a civil war could also serve as a basis or trigger for yet more traits. What's nice is that Civ games, and many others, have long had happiness penalties associated with empire size, and the revolt builds very nicely on top of that. Now you can actually piss off your populace to such a degree as to spawn a new enemy, and not just have it refuse to build tanks for a few turns.
Something like these "traits" already exists in Civ games: great people. It's more complicated on the whole, but for Civ5 generals at least, every time you kill a unit you get a chance to receive a great general. (The name is randomly selected, but if you are German it should be a great German general, or perhaps even a great German general from the era you are currently in, or a fictional one for cases like the Aztecs in 1937.) So in the end, if you go to war a lot, you get great generals. Perfect!
Why not extend this? Every time you move a unit into a forest tile, there could be a 0.1% chance of receiving a trait that negates the movement penalty for forests and gives a small combat bonus. (To make it less dependent on chance, you could say that every time you move into a forest, there's a 10% chance for the game engine to secretly assign a "forest point" to you; the points of course decay with time, and once you reach 100 points you get the trait.) Suddenly, players who have spawned near lots of hills get bonuses and perhaps unique units specializing in… hills! (Just like the Inca in Civ5.)
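The hidden accumulator might be sketched like this. The 10% chance and 100-point threshold come from the paragraph above; the decay rate is an invented assumption:

```python
# Hidden "forest points" accumulator: a 10% chance per forest entry to
# gain a point, points decay each turn, trait unlocks at 100 points.
# The chance and threshold come from the text; the decay rate and all
# names are invented for illustration.

import random

POINT_CHANCE = 0.10
DECAY_PER_TURN = 0.5
TRAIT_THRESHOLD = 100.0

class ForestAffinity:
    def __init__(self, rng=None):
        self.points = 0.0
        self.has_trait = False
        self.rng = rng or random.Random()

    def on_enter_forest(self):
        if not self.has_trait and self.rng.random() < POINT_CHANCE:
            self.points += 1.0

    def on_turn_end(self):
        self.points = max(0.0, self.points - DECAY_PER_TURN)
        if self.points >= TRAIT_THRESHOLD:
            self.has_trait = True
```

Because the points leak away every turn, only sustained forest-heavy play, not one ancient-era scouting trip, earns the trait.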
Since Civ AIs already act as if there were such a trait system in place (e.g. Montezuma always wants to fight as much as possible, as if to earn war-related traits), you will have AI Aztecs really acting like Aztecs, and really having the historically appropriate traits. The player, meanwhile, will be able to use his slightly exaggerated agency to take Mongolia and build it into a scientific and economic forerunner of the modern world: to change history in a meaningful way, according to his wishes, and forge his own empire, with its own character. That's a premise that could be realized far better than any game has managed so far, I think.
The traits themselves could even be generated semi-randomly like loot in RPGs such as Torchlight, along with “unique” traits corresponding to important real-world events. The same goes for unique units.
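Loot-style generation could be as simple as rolling a bonus type and a magnitude from tables; everything below is an invented example:

```python
# Torchlight-style semi-random trait generation: roll a bonus type and
# a magnitude, then attach a flavor name. All tables are invented.

import random

BONUS_TYPES = ["production", "science", "combat", "gold"]
FLAVOR = {"production": "Industrious", "science": "Enlightened",
          "combat": "Martial", "gold": "Mercantile"}

def generate_trait(rng):
    kind = rng.choice(BONUS_TYPES)
    percent = rng.choice([5, 10, 15])   # size of the percentage bonus
    return {"name": FLAVOR[kind] + " People",
            "bonus": kind, "percent": percent}
```

Hand-authored "unique" traits tied to real-world events could then simply be entries in the same tables with fixed rolls.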