Posts posted by Student of Trinity

  1. We've got a thread about books and now a thread about games, and the game thread is pretty harsh on games. That's good, actually; sometimes you learn something by waking up grumpy. When I started thinking of things I didn't like in games, I realized I was essentially complaining that the games weren't good books. But then sometimes my complaints about books are essentially that the books are bad games. So I thought of making a thread about comparing the things that go wrong in books and in games.

     

    Over the past year working on my hobby-book I've learned some things about how an adventure novel can go wrong, and one of the things I've noticed going wrong for me has been a tendency on my part to write like a Dungeon Master, carefully showcasing settings and situations and NPCs, but leaving the plot to be a series of improvised leaps between pre-ordained cutscenes concocted just to be cool. When I began my project I thought that I could make a good plot just by nitpicking those improvised leaps until the logic all checked out, but I've found that nitpicky logic just isn't a substitute for real plot coherence, where it's always clear what's basically happening, and what's basically happening makes sense and is interesting in itself. A reader's flow chart loops repeatedly through the 'Know what's happening?' decision node, and there is no line that leads from 'No' to a nitpicking subroutine of reconciling mechanical details. You have to make sure the answer is never No.

     

    This constraint seems to be trickier for games, because for the player there's a fine line between feeling plot momentum and feeling railroaded.

     

    Is the best game one for which an LP ('let's play' — a story-ization of a play-through) is a good book? Is the best book one that could easily be turned into a game?

     

    Any other thoughts on books versus games?

  2. That's a good graphic; that's about what it's like.

     

    Oh, maybe not entirely. If you can keep some sort of perspective, enough to not start imagining that you know everything because you know your one thing, you can maybe transfer a kind of meta-knowledge from your tiny specialty to almost everything. You know what it's like to understand something, and you know even better what it's like to not understand something but kid yourself that you do. So you can maybe recognize those things when you read something even well outside of your own field. It won't be the material that you know, but the person understanding it (or not) will be a mind operating much like your own.

  3. Yeah, UK education narrows the fastest of the systems I know. I'm not sure it compensates enough by getting deeper faster, too, or whether that would really be a compensation anyway. Presumably the assumption is that you can pick up a broad but shallow education by yourself, just by reading Wikipedia or something, so you may as well go narrow for your formal training. That's maybe not a bad idea, if people actually do it.

     

    For all the limitations of North American education, you can stay quite broad until quite late. A typical bachelor's degree with honors (so four years, and the normal prerequisite for graduate school) is about forty semester-long courses, and only about half of those are normally in one subject.

     

    The main point of the PhD is that you're not just learning stuff that other people already know any more. You're supposed to be exploring entirely new territory yourself, and telling the rest of the world about it. So in a sense that's finally the point of all the specialization: you pick a narrow enough topic that you can literally become the greatest expert in the world on it within a few years.

  4. Yeah, that sounds pretty broad. 19th and 20th century means "almost all of literature" and poststructuralism is probably not so far short of "almost all of criticism". Taking several years to figure out how you can really contribute is probably smart — if you can afford the time.

  5. Are they phasing out (or have they already done away with?) comprehensive exams over in Europe? My department is ditching them in favor of having students construct a portfolio, and I've heard of other history departments doing the same thing. I presume the sciences have, or at least had, something like comps.

     

    Germany has never had such a thing as comprehensive exams, and the modern PhD was a German invention of the 19th century. The German doctorate is just a thesis, period, and as far as I know that's how it has always been. I'm afraid I don't know much about other countries in Europe. I believe the Netherlands and Switzerland and Scandinavia are broadly similar to Germany. France is probably quite different, but I wouldn't know: the French system is virtually closed. Hardly any non-French people go into it, and hardly any French people leave it. So French science is certainly strong, but French higher education is a black box to me. I see its results but have no idea how they're produced. Italy and Spain seem to have a few excellent groups and people, but those that I know seem not to want to talk about their systems as such. My impression is that they are somehow very dysfunctional, but I can't say just how.

     

    In North America programs vary widely. Some places have comprehensive-like exams, but sometimes they are early in the PhD program (often called 'preliminary exams') and sometimes late. Other places have other requirements instead of the exams, like big essays or book reports or something. I remember feeling that comprehensive exams were a good thing because they ensured that everybody who came away with a doctoral title would at least have a basic grasp of undergraduate topics. On the other hand I admit that written exams are an artificial task with little relation to any real undertaking, so it's rather silly to make them a necessary step towards getting a degree. Probably the best approach, as I see it now, is to have something like a comprehensive, but grade it very leniently, so that you're really only checking basic understanding, and not how fast the person can scribble under pressure.

  6. Do "real", "imaginary" and "singularity" mean roughly the same as they do in mathematics?

    Yeah, they're supposed to be exactly the same. The issue here is that replacing time with an imaginary number really changes the equations and their solutions, because in general relativity time isn't just a fixed background parameter: it's part of the dynamical geometry. In some cases the switch makes the equations less singular, but the question is whether that's just wishful thinking.

     

    In the 'flat spacetime' of pure special relativity, you have to think about a 'pseudo-Euclidean' geometry, where time differences show up with a weird minus sign in the Pythagorean theorem. Euclidean geometry is where if you have two points whose separations along two perpendicular axes are x and y, then the distance s between the two points is given by s^2 = x^2 + y^2. And that's still true in special relativity for spatial separations; but if now one of the two axes is time rather than space, then you have to use s^2 = x^2 - (ct)^2, where c is the speed of light.
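
     

    A quick illustration of how strange that minus sign is: for a light signal, the spatial separation between two events on its path is exactly x = ct, so s^2 = (ct)^2 - (ct)^2 = 0. The spacetime 'distance' along a light ray is zero, however far apart its endpoints look in space.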

     

    That's weird, but it's not too weird, because in special relativity time itself is still fixed. It's only how people describe time that may change, depending on reference frame. So time will so to speak hold still while you mess around with it. One way of messing around that can be useful for some calculations is to 'analytically continue' time, and consider time paths in the complex plane instead of just the real line. Some quantities are guaranteed to remain the same when you do this, but sometimes it's easier to calculate those quantities by taking a roundabout path in complex time. It's just a mathematical trick for calculating things.

     

    It's easy to see how the trick works: if you rotate t into the complex plane, so that it becomes iz/c for some real z, then x^2 - (ct)^2 turns into x^2 + z^2, which looks like good old-fashioned Pythagoras again. So you can sometimes calculate things in relativity with complex time, purely as a mathematical device; and the trick can even be used to make time look just like space, if you want.
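
     

    Spelled out, the substitution is one line of algebra: with t = iz/c you have ct = iz, so (ct)^2 = (iz)^2 = -z^2, and therefore x^2 - (ct)^2 = x^2 - (-z^2) = x^2 + z^2.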

     

    Where Hawking kind of lost me was in extending this trick into general relativity, where the geometry of spacetime isn't just a fixed stage on which things are happening: now it's the play. Changing time to imaginary really changes things. It was never clear to me that those changes were right.

  7. I forgot to answer Triumph: I'm in physics. I finished grad school in Canada late last century, did some post-docs in Europe and the US, and now I'm a professor in Germany. The same is true of my wife, in linguistics, except that she was a tenured professor while we were in the US, instead of a post-doc. And I know people who've hacked some way through the academic jungle in the humanities. So on the one hand, between first and second hand, I probably have a broader experience of higher education than most people; but on the other hand what this has taught me is that academia is really diverse. I've only seen some of it, and I may well be quite clueless about the parts I haven't seen.

     

    What field are you in, Nikki?

  8. Master's degrees have been obsolete in North America in science for nearly 20 years, and I think that holds for the social sciences, too: basically any field that gets funding to pay graduate students for research. Often people formally enroll in a Master's program, but then just transfer into the PhD once they've done their coursework, and never do the Master's thesis.

     

    The problem with the Master's degree is that to go from just-finished-coursework to a publishable research article in one year is a real stretch. One article a year is not a bad speed even for a post-doc in my field, and the Master's student is starting from zero with the current literature. They're probably not yet even capable of picking up a current article and understanding it. So if I want to do actual research with a Master's student, I normally expect that it will be slightly more work for me than to just do it all myself, because I won't be able to count on the student doing a single darn thing independently, and I'll have to take time to explain things to them. Skipping the Master's phase, apart from coursework, cuts out the time-consuming step of writing a thesis that normally doesn't contribute to research. It lets the student just get on with learning their trade, maybe getting their name on a paper for a little bit of work on it in their first year, and working up to being lead author on an article after maybe two or three years.

     

    The humanities are different because normally the only way they have of paying students is for teaching assistance. And humanities departments usually have a lot of small seminar courses that they have to offer, with a lot of student papers to grade, so on the one hand they need a lot of grad student TAs, but on the other hand they can afford to keep paying them for years on end, because research grants are for a limited time but teaching goes on forever. It normally takes several more years to complete a doctorate in the humanities than it does in science, so you might as well break that up into two degrees. (Since there's so little research funding, the 'post-doc' phase is missing from the humanities' career path, and people can expect to apply for faculty positions as soon as they have their degree. The total training time is about equal to that in the sciences, but it's usually much more poorly paid.)

     

    Humanities research is also normally directed much more towards full books rather than short articles. Writing long documents is an important skill of the trade, and that's quite a thing to learn in itself; a 'junior' thesis is a good first step towards the big dissertation.

  9. I actually wrote my master's thesis (I'm one of the last North American PhDs to even do a master's degree first) using this imaginary-time quantum gravity stuff of Hawking's. But only kind of: I was doing a particular kind of problem for which imaginary time is a standard technique when gravity isn't involved. Doing it with gravity is actually a major extra leap, but in my particular context it didn't seem quite so arbitrary. Hawking's version is whole-hog, this-is-how-it-should-be. That makes other people raise eyebrows. As far as I know, it has no good answer to why time normally seems quite different from space. And that's kind of a big thing, not to be able to answer.

  10. Dikiyoba has a point, but it can't quite be that simple. It's notoriously difficult to distinguish murder from killing in general, but the distinction can't really be purely formal. People do take stabs, as it were, at making it more objective. Don't criminal codes have to define murder? Surely they can't just say, 'Killing that's against this law is hereby forbidden by this law.'

  11. I read Captain Vorpatril's Alliance a while ago, too. It was worth reading: a nice little story if you don't know the whole saga, and interesting to see a few persistent minor characters get some time in the limelight, if you do know the whole saga. On the other hand it's pretty, well, tepid compared to the peaks of the series, like Mirror Dance, say.

     

    In some ways that's just what it has to be. The whole point of Ivan is that he doesn't have Miles's manic urgency. Even his big adventure is going to be a lot less edgy. But once the series gets round to Ivan's big adventure, you know you're in epilogue country.

  12. I read the whole book believing it was standard stuff, and now I've lost my copy. Which particular topic was covered in that chapter?

    It's been a while since I read the thing, but I think it was the Euclidean spacetime stuff, where somehow in quantum gravity time isn't really any different from space at all. This leads to cosmology in which there isn't really any sharp 'Big Bang' beginning to time, or something. Far from being generally accepted, this is essentially just Hawking's own pet theory. It has a couple of cute features, but most people just find it bizarre and arbitrary. It hasn't fared particularly well in the years since that book was written, either. It never really caught on. Lots of people have looked for various alternatives to the Big Bang, in some kind of quantum theory, but there's no consensus that Hawking's version got this right.

     

    Arg: I don't seem to have kept my copy. I used to move every few years and purge my library aggressively whenever I did. I must have decided that although it was a neat book there was nothing in it that I didn't already know well, so I wouldn't ever re-read it, so away it went.

     

    So I can't check just what it was that I felt took a misleading turn.

  13. A Brief History of Time.

    It's a good book, except that you have to take the last chapter or so with a grain of salt. Hawking explains some stuff there that essentially nobody but he believes in, but he still gives the impression that it's as well accepted as the stuff in the previous chapters.

  14. Even when photorealism is possible there will be people wanting to do it with people. For art, if for no other reason. And audiences like having real people behind what they're viewing, too. Actors are themselves part of the story.

    I'm imagining that live theater will survive and even rebound. But movies are already so artificial. The sense that there's a live person behind the role seems pretty remote to me.

     

    I imagine there will still be identifiable actors who appear in multiple movie roles, looking slightly different; I think that's a convention people like. It helps set up a character by reminding people of all the other characters that actor has played; the actor is a sort of meta-character. I just don't see why the persistent meta-character needs to be a real human being. I think movie stars in the future will be like Mario and Sonic and Mickey Mouse. Live actors may still work on films, but they'll be helping the modelers get the expressions right, not playing through every scene.

     

    Maybe I underestimate the durability of the celebrity industry. Maybe the fact that actors have off-screen lives is an irreplaceable element of movie marketing. But I think the celebrity industry is more flexible than that. It can make celebrities out of anyone and anything. Movie actors aren't the only candidates at all. And even Mario and Sonic can have off-screen scandals. Just pop them into a controversial YouTube clip. Sure, film studios will lose some of the real human interest that comes from movie stars having actual human lives. They'll also lose the drawbacks of having stars crack up or need rehab. I'm thinking that the computer models will win out.

  15. I think that's where I'm maybe just naive and idealistic. But the way I think is: the villain's shiny axe and freakish hair are cool, and it's hard to go back to a story without such things, once you've seen the 3D glory; but the cool graphics are still really icing on the cake, gravy on the meat. What will make people really enjoy your production, and remember it, will be the same things that made people enjoy stories in the cave around the campfire, and remember the legend thousands of years later. The basic substance of the story itself will still be what really matters. Everything else, by itself, is at most a fad that will quickly pass.

     

    On the other hand, let me argue the other side. All the good basic stories are basically simple, and have already been told many times. What matters is to get synergy between the different elements of your story. There's a weird kind of threshold at which everything pulls together and the story's heart beats. It could be that the villain model's quirky hair is just the little touch that somehow picks up threads in the plot and theme and setting and makes it all click. Or at least it could be that the details of your models really work in creating a look and feel that sets everything else in a unique light.

     

    But all right, let me try to synthesize. There's a limit to how much a human mind can take in. You need to reach a threshold of synergy between all your elements, but past that threshold, returns diminish. The right thing for a real artist to do is to reach that threshold, then ship. Exactly which parts of your production are used to hit the threshold can vary a lot. Webcomics are maybe a great herald medium for what I'm thinking about here. Serial graphics are a much lower bar than 3D film, but you can already see how affordable software has drastically widened the bottleneck of manual talent. And some webcomics still work mainly on story and dialog, with primitive artwork, while others are beautiful tapestries with minimal prose and obscure plot. All of them can work.

     

    The element with the heaviest weight, in any extended production, is still the story. Great scenery can put a mediocre story over the top, but nothing can save a lousy story: people will just clip your single best post for a screensaver and stop following your comic. And a great story can easily carry some pretty crude artwork. So the skills of basic storytelling won't be the only skills, in the future, but they'll remain the most important skills. And while achieving magical synergy with elements apart from the story itself may remain as subtle an alchemy as ever, the obstacle of being able to achieve those elements at all will be much lower than ever before. So the skills of basic storytelling will have a wider scope for application than ever before.

     

    Alchemy will still be the Ars Magna, but a lot more people will be able to get into alchemy, when you can get antimony from Amazon at a click.

  16. I'm waiting for the day when an ordinary chump can tell Siri or Google, "Make me a villain with a big axe and really cool hair," and Poof! they've got a photo-realistic 3D model ready to be inserted in an action game or a movie. And in effect producing movies and top-grade games will be about as easy as writing a novel is, now. That is, it will be a lot of hard work to make a good one, but the tools and skills needed to finish one at all won't be the bottleneck.

     

    I doubt it will ever get quite that simple, actually. At least not in my lifetime. But I can imagine an awful lot of basic sound and graphics getting provided rather cheap by companies that cater to the amateur artist market. Siri probably won't make your villain that easily, but BitParts.com will, for twenty bucks. Sure, that particular villain will then be pirated all over the world; but I'm thinking that BitParts.com will do just enough customization of your villain, based on the background you provide about your game/story/movie, that you'll pay them for the customization, rather than taking a stock character for free.

     

    Along with the above I predict the eventual entire collapse of film acting as a profession, though I expect that the convention of having many different characters in many different stories all be Brad Pitt will survive, in the sense that recognizable base models and algorithms will appear in many places, and this will even be a selling point.

  17. BoA never really took off, and it always seemed to me that this was just because it was too much work to make a game with the more sophisticated engine. Jeff couldn't manage the work to eliminate the many bugs that didn't happen to hurt his own scenarios, and few designers were keen enough to finish a scenario. The Geneforge engine is fancier still than BoA. I think BoG would be a nightmare.

     

    I saw the same pattern with Escape Velocity — the 2D space combat and trading simulator games. They came in three successively more polished editions. All of them have lots of user-made patches and modifications, like extra ships and outfits. But the big deal with EV was 'total conversions', where someone re-wrote all the graphics and planets and items, and made a whole new story line that just used the basic game engine. For the original EV, which wasn't all that far above Pong in its look and feel, there were quite a few of these. There were also quite a number made for Escape Velocity: Override, the sequel, which had more sophisticated graphics. Escape Velocity: Nova had beautiful sprites, and subtle banking and accelerating effects, weapons glows and everything. Only a handful of total conversions were ever made, and to be honest only one real one was ever finished. Its author had started it in the original Escape Velocity, kept doggedly on with it, and finally just upgraded it. Moreover most of his ideas were just lifted from a bizarre pen-and-paper RPG that he and his friends had been playing for years. So it seems that literally nobody ever just sat down with EV:N and a big idea, and pushed it through to the end. EV and EV:O were amateur games, and the bar was set at an amateur level. EV:N was professional grade, and the bar was just too high for normal folks.

     

    I also know that the very thing I like about the Geneforge games would discourage me from writing a BoG scenario. Geneforge is a unique world with a unique flavor, and Jeff's story arc is a fascinating epic. I don't want to write a fanfic game, kind of like Jeff's but not as good. I'd want to make something more uniquely mine. Starting from the Geneforge system, with the Geneforge graphics, would probably just make it harder for me.
