Tuesday, June 30, 2009

A Closer Look at the Video Game Character Artist


If you love video games and want to pursue a career that allows you to design your own video games and video game characters, then you should get on track to becoming a video game character artist. The first thing any successful video game character artist should know is, of course, what makes a great video game and what kinds of characters best suit those games. Second, you will need advanced technical training in graphic design and computer animation so that you have the skills to see your ideas through.
Degree programs and technical training in video game design are available at colleges, universities and vocational schools across the country. Through these programs you can gain a thorough education in video game and character design that will prepare you for a great career in this growing field. In these programs you will learn to use state-of-the-art computer hardware and software that let you build video games at every level, from the initial storyboards to the final project. You will also learn to read and write basic programming languages such as C++ for coding video games, as well as tools such as Maya, 3DS MAX, or Softimage XSI for 3D modeling and animation of video game characters.
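To give a taste of what "writing video games in C++" or any other language actually means at the most basic level, here is a minimal sketch of the classic update/render game loop that design programs teach early on. The structure, names, and timing are illustrative only, not taken from any particular engine or curriculum.

```python
# Minimal fixed-step game loop sketch. Real engines add input handling,
# precise timing, and interpolation, but the skeleton looks like this.

DT = 1.0 / 60.0  # simulation step: 60 updates per second

def update(world, dt):
    """Advance the simulation: here, move the player 1 unit per second."""
    world["player_x"] += 1.0 * dt

def render(world):
    """Stand-in for drawing: real games rasterize, we just report state."""
    return f"player at x={world['player_x']:.3f}"

def run_frames(frames, dt=DT):
    """The loop itself: update the world, then render it, once per frame."""
    world = {"player_x": 0.0}
    rendered = []
    for _ in range(frames):
        update(world, dt)
        rendered.append(render(world))
    return world, rendered
```

After 60 frames at 60 updates per second, the player has moved one unit, which is the whole point of tying movement to the timestep rather than to the frame count.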

Get some training


So you've got what it takes to become a video game designer? You can't just walk into a game studio and get a job though. You need very specific training offered at video game design schools. Check out some of these top rated video game design schools listed below that offer gaming degrees.
ITT Tech: School of Drafting & Design offers a very strong, cutting-edge Digital Entertainment and Game Design degree. And with more than 100 locations nationwide, there's a good chance that you can find a school near you.
DeVry University offers something more toward the programming end of things with their hot Game and Simulation Programming program. This should really get your feet nice and wet. Take it online, or check out the various campus locations.
Another cool program worth checking out is the Game Art & Design degree at The Art Institute of Pittsburgh - Online Division. The nice thing is you take the course online.
Are you ready to take the bull by the horns and ride? Perhaps video game design is the career for you. There really has never been a better time to get on board. Maybe you'll become the next hot video game designer.

What qualities do I need for game design?


There are some qualities that game companies will be looking for when hiring someone into their fold. First off, you really should like video games. You should be the kind of person who hears the theme song to a game in your sleep. These companies want to hire someone who knows what makes a game good and what makes a game bad, someone who knows good level design versus poor level design. The only way to develop that judgment is by playing, and playing, AND playing video games over and over again. There are loads of people who spend all their time playing these games. Why should a company hire someone who hasn't put in their time gaming and doesn't share that passion?

Secondly, they are looking for someone with good problem-solving skills. There are so many bugs and potential problems when designing a game, ranging from collision detection to making things look convincing. You need to be the type who is willing to solve a problem even if it kills you. Hopefully it will not come to that.
Finally, they are looking for someone who can do things in a crunch. The gaming industry works at a torrid pace. Yes, you need to be patient, but you need to be awfully efficient in what you do.
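To make "collision detection" from the section above concrete: even its simplest form, the axis-aligned bounding-box (AABB) overlap test, is easy to get subtly wrong (off-by-one on touching edges, a swapped axis), which is exactly why it shows up on lists of classic game bugs. A minimal sketch, with illustrative field names:

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned bounding box: top-left corner position plus size."""
    x: float
    y: float
    w: float
    h: float

def collides(a: AABB, b: AABB) -> bool:
    """Boxes overlap only if they overlap on *both* axes.

    Using strict < means boxes that merely touch at an edge do not
    count as colliding -- one of the small decisions that breeds bugs.
    """
    return (a.x < b.x + b.w and b.x < a.x + a.w and
            a.y < b.y + b.h and b.y < a.y + a.h)
```

Whether an edge-to-edge touch counts as a hit is a design decision, and a mismatch between what the artist expects and what the code does is the kind of problem a designer spends days hunting down.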

What is the Outlook for the video game designer?


Video games are big business. We're talking about a multi-billion dollar industry that has outgrown many other industries in a very short period of time. Video games are no longer looked upon as kids' entertainment. Walk into any video game outlet and you will see 20- to 30-year-olds checking out the latest versions of Grand Theft Auto or the Sims. With the recent releases of the Xbox 360, the PS3 and the Nintendo Wii, producing games will be harder and a lot more involved. The design teams producing these games have ballooned in size to meet the demand for a next-gen experience.
So what does this mean for you? Basically, there are a load of opportunities to get into the video game industry. Video game popularity and complexity is exploding, and so are the opportunities at game companies and studios.

How to become a video game designer



Have you dreamt about working with the game developers on the next version of Halo?
Would you like to become a video game designer working on the PlayStation 3, Xbox 360, or Nintendo Wii? Well, you're not alone. Read a bit further, and we'll show you that you might not be that far away from your dream career. Learn how to become a video game designer.
One of the most exciting jobs on the market today is video game design. Many have the impression that it would take too much time and skill to accomplish this. Well, reality check, people... it does not! If you are into video games and love spending your leisure time (even your work time!) playing games, then you have already passed the qualifying stages to becoming a video game designer. Whether it be for console, arcade, or PC, video game design reaches all these platforms. Now, let me explain some of the basic principles to give you an idea of how to make this dream job work for you.

Throughout the years, games have evolved from the simple Atari and first-generation Nintendo graphics to the complex 3D and multi-level games of the present. There are just no limits to game design these days. This multibillion-dollar industry is ballooning and encompassing other industries at a pace no one ever thought possible. A major factor in this can be attributed to the gamers who are in their 20s and 30s and have never stopped picking up the latest games and game platforms. Walk into any gaming shop and you will find adults mingling with kids to check out the latest releases and try them out... together! So, combined with the teenage market, this raving monster called the gaming industry is eating chunks out of both the adult and teenage markets today.

The need for video game design teams has increased dramatically because of this growth in the industry and the demand for better and more creative games by consumers. Therefore, the outlook for anyone trying to enter this industry is not bad at all. If you have the love and drive to create games, then there are more than enough opportunities to do so. Become a video game designer and you could take part in, and even play a major role in, producing legendary games such as the Sims, Unreal Tournament and Halo, to name a few. Get a chance to team up with the best companies and game studios by becoming a video game designer.
If there ever was a time to take that leap of faith and trust your gut instincts, it is now! A genuine love for games is one of the most important qualities a game designer must possess, for a natural love of gaming spawns creativity and a drive to excel in producing games people will endear themselves to. You could even ask yourself: why would I do a job that I do not have the talent and passion for? Now, if your answer is "I am doing this because it is what I love to do," then you are the right person for this job.

This natural passion, though very important, is only one of the aspects that make a good video game designer. Companies who hire designers are also looking for someone with good problem-solving skills and near-inhuman patience. There are so many bugs and potential problems when designing a game that they could drive someone mad, ranging from collision detection to making the animation look convincing. Many of these complex problems need to be resolved quickly and with ingenuity. You should always strive to improve and innovate the game at any point in production. This can be profoundly stressful and take up huge amounts of your time, even to the point of exhaustion, but the rewards are more than worth it. The experience brings a completely new meaning to the phrase "sleeping on the job"!

Being a perfectionist is another quality sought after by many of these gaming companies and studios. Making sure everything is up to the highest quality standards, done as efficiently as possible and in as little time as possible, is a talent that is definitely in demand in this fast-paced industry. Now, if you have all, or even just some, of these qualities, then this job is waiting for you out there... so go out and grab it.

So now you are ready to proceed and create games, huh? Wait... you cannot just walk right into one of these companies and apply for this job if you do not have the skills!
Knowledge of graphic design, computer animation and game development is necessary here. You need very specific training in these and other elements in order to become a certified video game designer. Below is a listing of where you can get the best training in these areas of expertise. Without further ado, here are some of the premier video game design schools you may want to enroll in to become a top video game designer.

International Academy of Design and Technology: The Bachelor of Fine Arts in Visual Communication (Game Design) is designed to provide training in the principles and techniques used to create interactive 2D and 3D computer games. Students can learn design software; modeling and animation skills; networking principles; level and world editors; and the game engines used to design and develop games, and will examine market research and business concepts related to game production and distribution. Project management, creative design, and communication skills are integrated throughout this dynamic curriculum to help prepare students for entry-level positions in the game design industry.

DeVry University: Offers something more toward the programming end of things with their hot Game and Simulation Programming program. This should really get your feet nice and wet. Take it online, or check out the various campus locations.

ITT Technical Institute: The ITT Technical Institutes offer a bachelor's degree in Digital Entertainment and Game Design. Courses in this program offer a strong foundation in digital game design through the study of subjects such as gaming technology, the game design process, animation, level design, and general education coursework. In addition, with over 85 locations nationwide, there is a good chance that you can find a school near you.

Monday, June 29, 2009

How To Tackle Work For Hire


One of the toughest questions for any startup developer is whether or not to do work for hire. It provides revenue, which can look awfully appealing at times, but it can be a big distraction from the reason you started the company in the first place.

A little backstory: ever since we opened up our iPhone division, we’ve been flooded with requests to do work for hire. Not just iPhone games, but everything. Until recently, I had dismissed offers of work for hire out of hand: I tended to consider contract work a pernicious trap. Of course, something strange started happening when such work came in in volume. We got work for hire offers that I actually thought were neat -- by which I mean offers that were interesting and compelling independent of what they paid... work I’d be doing in house if I had thought of it first. Eventually, the temptation was too great; I caved and started taking some of these offers. Below I’ll detail what we’ve done to make contract work as positive as possible for the company, what we’ve done wrong, and what I feel is unavoidable.

The Problem

I’ve seen many studios begin doing work for hire because they needed capital and end up losing focus. They become work for hire houses that die out in a few years without ever accomplishing what they started their studio to do. When I began taking on work for hire I told myself I had to do the following things:

1. Distract as little as possible from the main endeavor.
2. Keep employees satisfied with their jobs (not have them feel like they’re being relegated to ‘lame’ work for hire work).
3. Have the work for hire more than pay for itself (this one sounds very basic, but I’ll explain below).
4. Add value to the company.

The Answers

Establish a New Division: After the first project, when we decided we were really going to start doing work for hire, we established a new division of the company for it, wholly separate from the core development team working on the main project. This had the advantage of minimizing the distraction from the main project and allowed us to establish new standards and guidelines specific to the work for hire division. It also allowed us to offer a different pay scale for the work for hire team (they get less salary, but they get time allotted to work on personal projects which Divide by Zero helps publish and from which they get the lion’s share of the revenue; on occasion they also get part of the rev share from the projects they work on).

When I say Divide by Zero is going to do something, I don’t believe in total subcontracting. I believe the quality of the work and the dynamics of the team are simply better built up over time in house. This has the disadvantage of keeping people on salary, which means the division has a fixed cost per month that either has to be covered by work for hire or we have to take a loss -- which means it takes a lot of will to remember to...

Choose Work Carefully: Up until this point we’ve been an entirely equity-financed company, which means that we aren’t relying on work for hire to survive. I’ve seen many studios that will just haphazardly take absolutely everything that comes their way -- sometimes because they need it to survive, sometimes because they don’t know when other work will become available (and sometimes because they just get caught up in the whole contract work thing and don’t take a step back and look at why they are doing it). This is fatal. We’re very careful about what we choose to take: we probably reject work for hire offers at a 6:1 ratio.
It was incredibly difficult to take a step back and say, “I’d rather take a loss on the work for hire division and give them more time to work on personal projects in any given month than take the wrong work.” But I believe it’s one of the best decisions we’ve made regarding this part of our business.

Ask Who Wants to Do a Project: One of our core requirements for taking on any particular work for hire venture is making sure that we have people who are excited and passionate about doing that particular work. Before I accept anything I go to the work for hire division and say, “We’ve got _____ coming up, who wants to work on it?” If I get back a positive response from a group of people with the required skill set and I believe they have the time in their schedule, I greenlight the project; otherwise it’s a no-go, even if management is interested in doing it. Not only does this increase the quality of the work we turn out (in my opinion), it keeps morale very high in the work for hire division and actually makes it easier to get things done. Anecdotally, people in our work for hire division psych each other up and ask each other for favors. If a team is mostly formed but doesn’t have one of the people it needs, the people who want to do the project usually convince whomever they need to get on board. I rarely ever have to step in.

Also, this is the only case where I’ll contract out. If we’ve got three people raring to go and they need one more person to complete the project, I’ve had teams come to me and say, “I know this guy, I’ll get him to work for almost nothing and I’ll totally manage him. If he fails I’ll take responsibility.” In that case I’ll give it a shot. It’s worked out well for us so far (we’ve even hired someone out of a situation like that).

Approach Companies: If you are going to do work for hire, biz dev is important. While a lot of work just comes our way, we’ll actively go seek out work if we think it’s cool.
Often we’ll be at lunch and someone will say, "Wouldn’t working with X be awesome?" If it gets a resounding "hell yeah," then we go out and approach those people. Part of our core work for hire business is reviving dead IP (you’ll see in a few months…) and helping companies expand their IP into the interactive space. This allows us to continually work on projects that we think are compelling, which would be impossible if we just sat back waiting for business.

Do the Math Right: Don’t try to overbid and don’t try to impress anybody by bidding low. Assume everyone’s acting in good faith and be okay when they walk. You’re not going to get every contract. Remember, everything’s going to run long or need some extra hands on it -- there are always unexpected changes in this sort of work -- so budget for it. So long as you’re giving an honest estimate of what you think your team will require, no one will fault you. A word of warning: make sure to be very clear on what you’re being contracted to do. Many entities don’t understand what it takes to make games, and when dealing with contract work we find that people often change their minds, or come to realize what they actually want, only after the work begins. Don’t be afraid to let your client know when they want you to do something that exceeds what you originally agreed upon and will cost them more money.

We usually recommend that clients commission us for a design spec first. This is less risk on both sides and is the cheapest way to attain clarity and make sure that everyone’s talking about the same project. It also gives both sides a clean break point to work with other vendors if we discover the project is something we can’t really handle in-house (and if they do choose to walk, at least we’ve provided them with a document they can take to anyone else to determine if that vendor is right for the project).

Be Flexible: We’ve had some crazy stuff come across the desk when dealing with contract work... and crazy stuff is often the most fun (and most profitable) to work with. We’ve specifically built the work for hire division around people with a broad set of skills who are willing to dive into anything and learn (who knew how handy a working knowledge of ancient Greek would be?). Additionally, as a business, you have to be flexible: if someone has a big meeting with their funder and needs a document done in two days, you’ve got to figure out a way to make it work (assuming, of course, they’re willing to take on the cost associated with doing so); if someone wants some crazy payment structure, consider it. You want the opportunity to say no, not the necessity.

Adding Value: We also look for work for hire that adds value to the company beyond what it pays. In some cases this means building out a tool that we can use in other projects; in some cases it means broadening our skill set; in others it’s establishing good relations with an interesting IP holder, or even getting to experiment with mechanics we hope to have in our major release.

Downsides: Work for hire is a distraction. No matter how much we’ve tried to mitigate it, and no matter how separate we keep the teams, it takes a lot of management overhead. Preparing the contracts, finding cool projects, making the deals, managing the relationships with clients, even doing strategic planning for two divisions instead of one requires a lot from the management team. Don’t delude yourself going in (we did, a little bit) -- work for hire will change how your business functions. It may offset your costs, but it won’t fund any major project; you’re still going to have to raise capital to put out your AAA title. If you are going to do this, do it because it will allow you to employ a few people during a down economy. Understand that it will cost you something and hope that the benefits outweigh the costs. That doesn’t mean do it stupidly: you’re a business and you want to maximize every opportunity you have.
Be keenly aware of what contract work is doing to you as a company, and be ready to back away if it ever takes you too far off course from your original intent for the company.
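The "Do the Math Right" advice above boils down to simple arithmetic: quote your honest estimate plus a buffer for the overruns that always happen, rather than a lowball that impresses nobody for long. The 25 percent contingency figure below is purely illustrative, not a number from this post.

```python
def honest_bid(estimated_hours, hourly_rate, contingency=0.25):
    """Quote the real estimate plus a buffer for overruns.

    contingency=0.25 is an illustrative assumption; pick a figure that
    reflects how often your own projects run long, and say so in the bid.
    """
    return estimated_hours * hourly_rate * (1.0 + contingency)

# A project you honestly estimate at 400 hours at $50/hour:
# 400 * 50 * 1.25 = a $25,000 bid, not the $20,000 bare estimate.
```

The point is not the specific percentage but that the buffer is declared up front, so when scope grows mid-project the conversation is about a change order, not an apology.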

The Accretive Player Character


"Accretive player character" is a term of art in interactive fiction. It refers to a protagonist who has motives and abilities that the player doesn't understand or share the first time he plays through a game, but that he gradually learns over the course of many replayings.

Perhaps the classic example is Adam Cadre's Varicella, whose title character is a scheming palace minister in an alternate-reality Italian principality that combines the technology of the modern day with the ethics of Machiavelli and the methodology of the Borgias. Varicella hopes to outmaneuver everyone else and end up with the regency. He is a fastidious man with a protocol fetish -- not the strongest, nor the most charismatic, nor the most openly ambitious member of the court. But he does happen to know all the weaknesses of all his opponents, and he's poised to take advantage of them. The first time the player confronts this scenario, he lacks Varicella's keen political understanding, and inevitably loses. The second time, he's probably still not up to snuff, but he loses in a new way, for a new reason. By the winning runthrough, he has Varicella's role down. He does know exactly what is happening everywhere in the castle, every minute of the day. He does know exactly whom to assassinate, when, and how. The player/protagonist gap has been erased by careful training.

It takes a special kind of game structure to make the accretive player character work. The game has to play through fast, so that the player doesn't get frustrated at the prospect of retracing his steps. It has to be fun enough and deep enough to keep the player coming back despite the obvious frustration of losing many, many times. It has to give enough feedback and guidance that the player learns from every losing step and has a chance to make a new, better plan for the next runthrough.
It has to offer significantly different experiences to the naive player and to the accomplished one who understands what he needs to do. Perhaps the hardest aspect of all: the game has to somehow communicate to the player, "Look, it's all right, you won't get this the first time, but that doesn't mean the game is too hard for you. Stay around. You'll learn."

Done right, though, the accretive PC is a hugely powerful device. Having the player learn by replay allows for a narrative arc with no tutorial phase, no gradual ramping up; big things can start happening right away. The first several times the player goes through the game are the tutorial. Early playthroughs of Varicella involve a lot of wandering around the palace, meeting and getting to know people, and spying out secrets. That's fun and interesting, and it lays the groundwork for later -- when Varicella has time for none of those things, and instead executes a ruthless plan. The final runthrough is a lean story with no futzing around, no time lost.

The accretive PC also promotes player identification. The better the player understands the protagonist, the more persuasively he can play the role right from the start. This cuts both ways, because (I also find) if the player has spent hours upon hours working out the perfect plot for Varicella to execute, he's less likely to balk at some of the protagonist's more reprehensible actions.

Jon Ingold's recent mystery Make It Good (also playable online) takes the same concept of the accretive PC and plays it in new directions. The protagonist is a down-on-his-luck cop, one drink from being kicked off the force, who is called in to investigate a murder. As the game progresses, it becomes clear that the player is going to have to do more than find evidence. He's also going to have to manipulate the suspects in order to bring about a satisfactory ending.
Gameplay thus goes beyond interrogating suspects about keywords and looking under beds, into the realm of assessing and meddling with the psychology of the non-player characters. And these are alert characters, not the dull ciphers found in many games. They see what the player is carrying, and when. They observe how he acts. They notice evidence if it's left in plain sight, and sometimes find it when it isn't. They draw conclusions. They talk to one another, sometimes behind the player's back. All this makes for very difficult gameplay, and there are times when the implementation doesn't live up to the design challenges as smoothly as it might -- but when the process works, it is compelling.

To complicate matters further, the protagonist evidently knows, or half-remembers through the boozy haze, some important things. A full accounting of what he thinks is never available. While he doesn't exactly lie in his thoughts to the player, he verges on being an unreliable narrator. It's frustrating not having clear access to what the protagonist really knows. The player is stuck running around doing his best to look out for this character's interests without having full access to his memories or total control over his thoughts. And yet that too factors very effectively into the characterization: the player is coping with the constraints imposed by the narrative, while the protagonist is struggling with a long-term devotion to cheap whiskey. The player would like to know more. The protagonist would like control over his life. Gradually those two desires converge. In the winning playthrough, the player-protagonist finally achieves both agency and understanding.

Female Gamer Population Increasing On Consoles


The NPD Group says the number of female gamers in the audience is up year over year -- "significantly" on consoles. The percentage of console gamers who are female grew from 23 to 28 percent in 2009, according to the NPD's new market segment study. The rise is mainly attributable to the Wii, which itself saw a 19 percent increase in usage over last year. But even in more core-market outliers, female presence is up, the study results assert. The audience defined as "heavy" users of portable systems saw its percentage of female constituents rise four points, as did the portion of the audience defined as "extreme" gamers -- those who play an average of 39 hours per week.

"Last year was one of the most transformative in history in terms of defining the audience for gaming," says NPD analyst Anita Frazier. "Even with the increased competition from mobile and social network gaming, the console gamer segment added the most new participants to its ranks in the last year."

38 percent of total gamer time was spent playing online games, finds the NPD -- flat compared to the previous year, suggesting that as the industry eagerly moves online, the audience may not be following at quite the same rate. In fact, the NPD says that while 16 percent of game purchases in 2008's fourth quarter were downloaded digitally, the average number of gamers paying for microtransactions decreased over last year. "This could be caused by the increased availability of free gaming, putting a downward price pressure on the industry," says Frazier.

The NPD's study is based on a survey of 20,000 members of its online consumer panel of people "ages two to 65 plus" who say they personally play video games. Responses for two-year-olds, it says, were captured by asking parents to help their kids use the computer in order to answer the questions.

Nintendo in a great position in the console and portable business


In past game console cycles, sales dipped during the transitions. Software sales in the U.S. were down 9 percent in 2000, for instance, and 3 percent in 2005. The current generation of consoles began when Microsoft launched the Xbox 360 in November 2005. Portables, meanwhile, saw a renaissance that began in 2004 with the launch of the Nintendo DS. The launch of the Wii and the DS Lite in 2006 changed everything in Nintendo’s favor. Since 2005, overall software revenues have grown 60 percent, largely because of Nintendo’s success.

Nintendo’s Wii is expected to be the dominant console, with an expected 49 percent share of the U.S. and European markets at the end of 2009, compared to 29 percent for the Xbox 360 and 23 percent for the PS3. But by 2011, Sony is expected to pull even with Microsoft in the No. 2 spot. Of the $44 billion in worldwide sales in 2008, about 40 percent, or $17.7 billion, was hardware sales.

Hardware sales of consoles are tracking about the same as in the last generation. To date, high sales of the Wii are offsetting slower unit sales of the PS3 and Xbox 360. The report says there were roughly 78 million current-generation consoles sold worldwide as of year-end 2008, with 16 million PS3s, 23 million Xbox 360s and 39 million Wiis sold. That compares to 78 million legacy consoles sold worldwide at year-end 2003, with 54 million PS2s, 12 million Xboxes and 12 million GameCubes. The report expects consoles per gaming household to stay constant at about 1.4.
Nintendo has essentially turned the tables on Sony. It succeeded in grabbing the mass market, while the Xbox 360 and PlayStation 3 snared the hardcore gamers with high-definition TVs. Nintendo started out with a low-priced console aimed at party gamers, with the added twist of the unique Wii motion-sensing controller. The Wii enabled much earlier and broader participation by the mass market than would otherwise happen in a console cycle.
Wedbush Morgan predicts that the console cycle is likely to last beyond the typical five-year mark, which would otherwise see a new cycle begin in 2010. While Nintendo may introduce a high-definition version of the Wii, the console makers are not likely to be motivated to introduce new machines in the near future.
Costs are going up, but digital distribution has promise. The cost of making games has risen: in the last generation, the average console game took 18 to 36 months to finish and cost an average of $4 million. In the current cycle, console games take 24 to 36 months to develop, and average development costs are now $8-10 million. Those costs are starting to subside as developers learn how to make games more efficiently.
Online games, casual games and mobile games are expected to grow steadily, while downloadable content and in-game advertising will really pick up once the consoles reach higher household penetration. Activision Blizzard is an exception, since a quarter of its revenue and half of its profits come from the online game World of Warcraft.
Online subscriptions and game-related downloads are expected to be a $3 billion opportunity as games-on-demand services such as OnLive launch. Wedbush Morgan predicts that OnLive will be a breakthrough success and will be widely adopted, but it will be a long time before such services take a big chunk of the market away from retailers.
Asia, meanwhile, is likely to be the market where massively multiplayer online games such as WoW take off. There are an estimated 10 million MMO players in Asia, generating monthly revenues of $500-700 million. Free-to-play games, supported by micro-transactions such as sales of virtual goods, are also a small but fast-growing part of the game market.
Digital downloads will eventually become a big market, but for now they are hampered by the small hard drives on the consoles (20 to 80 gigabytes on an Xbox 360).
The iPhone and in-game ad opportunities are over-hyped. Cell phone games are roughly a $2 billion market overall, largely from downloads on older phones, but they are not expected to close in on console or portable game sales for some time, largely due to the hardware limitations of the devices. Cell phone games are viewed as a path to turning casual gamers, particularly women, into more active gamers. But the market in dollar terms is still small, since the average paid game costs about $2 and overall iPhone game revenues in their first year were about $400 million.

Games are the fastest-growing entertainment


In 2008, worldwide hardware and software sales were $44 billion, while US hardware and software sales were $19.5 billion. That compares to the combined US movie box office and DVD movie market of $22 billion and entertainment-related book sales of $9 billion. Right now, games are about 15 percent of the overall $75 billion in entertainment spending in the US. That percentage is expected to grow, since games are the fastest-growing entertainment sector. A reason for the growth is that gamers continue to play games even as they get older; the average age of gamers is now 35, compared to under 20 in the 1980s. There are also many more female gamers, who now account for 35 percent of the console market. In a separate report today, market researcher NPD said that females are a growing part of the console audience.
Beyond 2011, revenues from non-traditional sources — online games, casual games, mobile phone games, downloadable content, and in-game advertising — are expected to contribute meaningfully, offsetting slowing growth of packaged goods software sales, the report says. Over 10 years, digital content is expected to account for almost all of the game industry’s growth.

Games will see strong growth over the next 3 years


Game publishers should be able to grow revenues 9 percent a year in the U.S. and Europe as they embrace digital content and find ways to extend the consumption of games for the current generation of game consoles, a new report says.
The market for console game software is expected to grow at a compound annual growth rate of 9 percent from 2009 to 2011, while hardware revenues will decline 25 percent a year due to falling prices, according to Michael Pachter and Edward Woo, the analysts who created the report for Wedbush Morgan Securities. All other entertainment is expected to grow zero to 2 percent in the same time frame, the report says. That's a bullish forecast, given that game industry growth has turned negative in the past few months as the recession slows demand.

The 210-page report details the size of the game industry as well as major trends within it and how they affect startups trying to find opportunities in the digital content portion of the game industry. As such, it provides some good insight into several game-industry issues I discussed in one of our most popular posts about the game business.

The Sega Nomad 1995


The only reason this isn't better known as the most misguided piece of hardware in gaming history is that it came out around the same time as the Sega Saturn, Sega's follow-up to the Genesis. The Saturn was a more high-profile failure that almost certainly lost Sega more money, but at least you could play an entire fucking game on it. The reason that's relevant to the Nomad will become clear in a moment.
At first, the Nomad seemed like the best idea ever. The Sega Genesis was already hugely popular, and instead of designing a whole different handheld system, Sega just made a portable Genesis. It even played the same games; you just shoved the Genesis cartridge right in there.
It came out several years after Nintendo's Game Boy with its crappy black and white screen. So how could this not take the handheld world by storm?
It was more than twice the price of a Game Boy (around $270 in today's money). And while it played Genesis games, it only did so for about an hour and a half before all six of your AA batteries died. So, yeah, it would blow through five bucks' worth of batteries every time you fired it up, and you'd barely have time to get past a couple of levels.
Sure, you could order a rechargeable battery pack for about 80 bucks more ($110 in today's dollars). But strangely enough, after spending the equivalent of $380 on this thing, it still wouldn't blow you. Oh, and the rechargeable battery ran out even faster.
You also couldn't tilt, turn, drop, shake, bump or otherwise move the Nomad lest it freeze or just quit working altogether. Sega considered adding a feature where they would drive to each customer's house and punch them in the face, but couldn't decide what to charge for it.

NEC SuperGrafx 1989



Let's face it, most video game consoles are failures. But few fail on the scale of the SuperGrafx.
Some of you may have owned the TurboGrafx-16 console, but may not know that the Japanese version (called the PC Engine) utterly dominated in that country, beating even the Nintendo NES in its day. Cocky from that success, NEC decided to beef up the TurboGrafx console and released the SuperGrafx in 1989. The beefed-up machine had better graphics and, with an add-on, CD-ROM capability. And the games were only $110 each! Or $170 in today's money!

Uh, yeah. It was bad enough that the system itself was $300 ($540 adjusted for inflation), CD-ROM not included.
All in all, only five games were ever made specifically for the console (it could also play the original TurboGrafx games). The sad thing was, you still couldn't get the whole library without spending enough money to buy a beat-up used car instead.

The Vectrex 1982


Considering all of the me-too ripoff consoles that seemed doomed to failure based on pure lack of effort, the Vectrex looks like a piece of engineering genius.
Look at it, the thing comes with its own built-in screen! The controllers have little joysticks and buttons, just like the more modern control pads! What could go wrong?

Well, for the graphics, it used line-based two-color vectors instead of the colorful raster graphics everybody else used. So where Atari's Pole Position looked like this...

Keep in mind, the second pic is from a console that came out five years later. The Vectrex was an advanced machine in every other sense (some of the all-time great arcade games used vector graphics, like Asteroids), but come on. Two colors?
The Vectrex engineers, being the top minds they were, came up with a brilliant solution to this problem: include several thin sheets of colored plastic with the system, so people could just lay them over the screen to give it some color. No, seriously.
It was at that point the gaming public stood up, pointed out the door and said, "Vectrex, get the fuck out of my office."
Just a year later, the video game industry crashed and took the Vectrex with it.

The Epoch Cassette Vision 1981


The Epoch was the first home console made in Japan. That's right--all of the systems we've mentioned up until now were manufactured in the USA.
The Cassette Vision came out a full two years before Nintendo would enter the scene, so Epoch basically had no competition. The Japanese really weren't into playing video games at home. They loved their arcades (Space Invaders was so popular it literally created a coin shortage), but none of the American game consoles had caught on.
Epoch had a unique vision of fixing this by bringing Japan a console that was quite a bit shittier than others on the market. One big problem was the controllers. You know when you're playing video games and you're leaning left and right with the controller, throwing it in anger, all that?
Well, if you did that with the Epoch, you'd be out of a system, seeing as how the controllers were just knobs built onto the console itself.
Imagine booting up the ol' Xbox and then holding it on your lap as you play. That's what the Epoch was. Keep in mind this thing came out five years after the Fairchild up there and its handy wired controllers.
So the thing sold horribly and then, in 1983, Nintendo released the Famicom (the NES to us) and the Epoch Cassette Vision was forgotten forever.
Epoch, however, struck back with the Cassette Vision Junior. They apparently figured their first system failed because awestruck consumers were intimidated by its technical prowess, so they corrected this by making the Junior much smaller and shittier than the original.
At this point consumers in Japan made it clear that if Epoch ever tried to release a video game system again, they would burn down their office.

The Fairchild Channel F


The Channel F was designed by Jerry Lawson around the Fairchild F8 CPU, one of the first 8-bit microprocessors. The F8 was very complex compared to the typical integrated circuits of the day, and had more inputs and outputs than other contemporary chips. Because chip packaging was not available with enough pins, the F8 was fabricated as a pair of chips that had to be used together to form a complete CPU. The graphics were quite basic, although they were in color, which was a large step forward from the contemporary Pong machines. Sound was played through an internal speaker rather than the TV set.
The controllers were joysticks without a base; the main body was a large hand grip with a triangular "cap" on top, the cap being the portion that actually moved. It could be used as both a joystick and a paddle (by twisting), and could not only be pushed down to operate as a fire button but also pulled up. The unit contained a small compartment for storing the controllers when moving it; this was useful because the wiring was notoriously flimsy and even normal movement could break it.
This was released in August 1976 under the incredibly vague name "Video Entertainment System." Predictably, this name turned out to be too similar to another console, the Video Computer System released by Atari around the same time, so Fairchild was forced to change the name to the Channel F. Atari then changed the name of their system to the Atari 2600, but Fairchild didn't bother changing theirs back because, really, what would be the point.
The Channel F was groundbreaking in that it was the first console to use cartridges with the games on them (it came out before the Atari, and everything up to that point had been like the Pong machines, where the games were programmed into the console and those were the only ones you could ever play).
It didn't help that the wiring in the Channel F was apparently as brittle as uncooked spaghetti. Also, while the controllers were the closest thing to a wired Wii-mote at the time (in shape, at least), they didn't have a base. That meant you couldn't rest it on your stomach along with your drink and a bag of chips.
Fairchild, not content to lose to Atari, came back with the Channel F System II in 1979. A whopping six games were released before it was put out of its misery.

The 5 Most Retarded Gaming Consoles Ever Released


Most gamers remember Atari's Pong for getting the whole video game console party started back in 1975. Most do not know that Pong was stolen from an even earlier console, the Magnavox Odyssey (Pong was a ripoff of their Table Tennis game). Magnavox even sued over it, and reached an out-of-court settlement in their favor.
But nobody bought the Magnavox Odyssey, mostly because a primitive public that was frightened and confused by electronics thought that the Magnavox game console would only work on Magnavox televisions. So Atari utterly dominated the industry instead, delighting an extremely easily entertained nation.

A bitter Magnavox was looking to score some more of that Pong cash that everyone was rolling in. So in 1975 they cranked out two new versions of the Magnavox Odyssey: the 100 and 200.
Magnavox really wasn't thinking big, and figured that Pong would be the only video game ever. So they released consoles dedicated to very slight Pong variations, amusingly called things like TENNIS and HOCKEY (the two games on the Odyssey 100; the 200 model had a third).

The Next Generation of Gaming Consoles


We're still a long way away from seeing the next generation of Xboxes, PlayStations and whatever Nintendo has up its sleeve, but Ubisoft is already preparing for the future.
"The next generation is going to be so powerful that playing a game is going to be the equivalent of playing a CGI movie today," predicts Yves Guillemot, chairman and CEO of the publisher.
Next generation, Guillemot estimates, top-tier games will likely average $60 million to make.
The ramifications of that are unknown. It could mean higher retail prices or lower return on investment. Ubisoft hopes to offset the cost by reusing assets from the film community, as it is currently doing with its game adaptation of James
Microsoft is the most ambitious of the console makers, saying it plans to market the launch of its recently announced "Project Natal" motion-sensing controller as prominently as it marketed the launch of the Xbox 360. In essence, Natal’s launch will be a next generation launch of sorts.

Games consoles seventh Generation


In the history of video games, the seventh generation primarily refers to the consoles released since 2005 by Nintendo, Microsoft, and Sony.
For home consoles, the seventh generation began on November 22, 2005 with the release of Microsoft's Xbox 360, and continued with the release of Sony's PlayStation 3 on November 11, 2006, and Nintendo's Wii on November 19, 2006. Each new console introduced a breakthrough technology: the Xbox 360 and PlayStation 3 offered high-definition graphics, while the Wii focused on controllers with motion sensors rather than conventional joysticks (the PlayStation 3 also employs motion sensitivity, but to a lesser degree). Most consoles have wireless controllers, while the Xbox 360 also offers wired controllers as an alternative. The PlayStation 3 controller can be charged through a USB-A/mini-B cable. The wireless Xbox 360 controller uses either a rechargeable battery pack or 2 AA batteries; the same is true of the Wii.
For handheld consoles, the seventh generation began on November 21, 2004 with the North American introduction of the Nintendo DS as a "third pillar" alongside Nintendo's existing Game Boy Advance and GameCube consoles. The Nintendo DS features a touch screen and built-in microphone, and supports the wireless IEEE 802.11 (Wi-Fi) standard. More recently, the DSi added two built-in cameras, the ability to download games from the DSi Shop, and a web browser. The PlayStation Portable, released later the same year on December 12, 2004, followed a different pattern. It became the first handheld video game console to use an optical disc format, the Universal Media Disc (UMD), as its primary storage medium. Sony also gave the PlayStation Portable robust multimedia capability, connectivity with the PlayStation 3 and other PSPs, and Internet connectivity. The Nintendo DS likewise connects to the Internet through the Nintendo Wi-Fi Connection and the Nintendo DS Browser, as well as wirelessly to other DS systems and Wii consoles. Despite high sales for both consoles, PlayStation Portable sales have consistently lagged behind those of the Nintendo DS.

Survival horror



Survival horror is a subgenre of action-adventure video game inspired by horror fiction. These games make the player vulnerable by providing them with less ammunition and fewer heavy weapons than other action games. Although combat is a part of the gameplay, the player must ration ammunition by evading enemies and avoiding direct confrontation. The player is also challenged to find items that unlock the path to new areas, and solve puzzles at certain locations. Games make use of strong horror themes, and the player is often challenged to navigate dark maze-like environments, and react to unexpected attacks from supernatural monsters.
The term "survival horror" was first used for the original Japanese release of Resident Evil in 1996, which was influenced by earlier games with a horror theme such as Sweet Home and Alone in the Dark. The name has been used since then to describe games with similar gameplay, and has been retroactively applied to games as old as Haunted House from 1981. Starting with the release of Resident Evil 4 in 2005, the genre began to incorporate more features from action games, which has led game journalists to question whether long-standing survival horror franchises have abandoned the genre. Still, the survival horror genre has persisted in one form or another, with several critically acclaimed titles in recent years.

Survival horror refers to a subgenre of action-adventure video games that draws heavily from the conventions of horror fiction. The player character is vulnerable and under-armed, which puts the emphasis on puzzle-solving and evasion rather than violence. Games commonly challenge the player to manage their inventory and ration scarce resources and ammunition.
While many action games feature lone protagonists versus swarms of enemies in a suspenseful environment, survival horror games are distinct from otherwise horror-themed action games. Rather, they de-emphasize combat in favor of challenges such as hiding and running from enemies, and solving puzzles. Still, it is not unusual for survival horror games to draw upon elements from first-person shooters, action-adventure games, or even role-playing games.

The origins of the survival horror game can be traced back to earlier horror fiction. Archetypes have been linked to the books of H. P. Lovecraft, which include investigative narratives and journeys through the depths. Comparisons have been made between Lovecraft's Cthulhoid Old Ones and the boss encounters seen in many survival horror games. Themes of survival have also been traced to the slasher film subgenre, where the protagonist endures a confrontation with the ultimate antagonist.
Some common elements of survival horror games can be found in the 1981 Atari 2600 game Haunted House. Gameplay was typical of future survival horror titles, as it emphasized puzzle-solving and evasive action rather than violence. The game made use of monsters commonly featured in horror fiction, such as bats and ghosts, each with unique behaviors. Gameplay also incorporated item collection and inventory management, along with areas that were inaccessible until the appropriate item was found. Because it has several features seen in later survival horror games, some reviewers have retroactively classified it as the first in the genre. Many of the features of the genre could also be seen in the 1989 release Sweet Home, for the Nintendo Entertainment System. Gameplay involved battling horrifying creatures and solving puzzles. Developed by Capcom, the game would become an influence on their later release Resident Evil, which reused its mansion setting and its "opening door" load screen.

The Consensus Conduit ...............


The Wii certainly has the market cornered on minigame collections, and games where you take care of virtual pets... but one genre doesn't get a whole lot of play on Nintendo's machine: the first-person shooter. Now, animal husbandry is great and all, but sometimes you just want to make with the bang-bang. Sega and developer High Voltage Software give us that opportunity with The Conduit.

The game's X-Files-esque plot mixes the old "aliens attacking Washington D.C." bit with a healthy dose of conspiracy theory. Secret organizations and mysterious diseases brandish spooky-sounding names like "the Trust" and "the Bug," respectively, and you end up forming an alliance with a terrorist (or is he really the good guy?!) named Prometheus. The story tends to take itself a bit too seriously for a game that involves killing aliens while running around in the White House, and the fact that it's only advanced through voice-only radio chatter doesn't make it any more interesting.
Still, who plays an FPS for the plot? The main draw is the action. If you're one of those people who checks out the options menu before diving into a game, you'll see that High Voltage went to considerable lengths to ensure that The Conduit takes full advantage of the Wii's unique controller capabilities. The default controls work great, with the nunchuk being used to move and the Wii-mote to aim. Even the motion controls work well (shake the nunchuk for a grenade; shake the Wii-mote for a melee attack). If the standard controls don't suit you, they're freely adjustable. Not only that, but darn near everything else about the game can be rearranged -- from the placement of HUD elements to how far from the center of the screen you have to aim your cursor before the camera starts to pan.

Given the complexity of the control settings, I was a bit surprised to see just how basic the actual gameplay is -- the bulk of the stages are fairly dull. Linear maps frequently drop the game into an "enter a room, shoot all the bad guys, walk to the next room, repeat" rut. Not to say that a simplistic FPS can't be fun, but the action tends to get a bit repetitive, even considering the game's relatively short campaign mode.

For a bit of added variety, you're given control of the All-Seeing Eye (ASE), an orb-shaped technological doodad (read: glorified flashlight) that allows you to see hidden doors, objects, and puzzles. Unfortunately, its implementation makes it feel less like a valuable tool and more like a forced gimmick.
I can forgive these flaws, though, thanks to The Conduit's fantastic online multiplayer mode -- which supports up to 12 players per match. The maps are well-designed for fragging, and the ASE is only used to play this game's version of "kill the guy with the ball." The Wii's dreaded friend codes are present, but they're only used if you want to specifically play with your buddies. If you don't mind shooting strangers, you can immediately dive into a game with random players from your local region or around the world. The Conduit offers a good variety of different multiplayer modes, and all of the matches that I played were refreshingly lag-free. The game even supports the Wii Speak accessory, which is great for all those fans of both FPS games and Animal Crossing (yeah, I suppose you could just buy the mic separately, if you'd like).

Wii owners have been waiting a while for a top-notch FPS to call their own, and although the single-player mode has its faults, The Conduit's multiplayer experience helps to lessen the pain. Hopefully, other companies will follow Sega and High Voltage's lead. Who knows? Maybe one day the Wii will have as many quality first-person shooters as it does Petz games.

Something Old Something New in StarCraft 2


On a fundamental level, StarCraft II is simply the original StarCraft remade with 3D graphics. After all, you can't (or shouldn't) reinvent a game with such a dedicated competitor base -- just imagine a baseball 2.0 where all the rules were changed. The sports analogy isn't too far off from the truth; StarCraft is one of the most competitive e-sports in the world, and it's been called the unofficial national sport of South Korea. But while the basics of the game have remained intact to keep from alienating the devoted player base, the changes are sweeping enough to make any StarCraft veteran take pause and get used to what's new.
I recently had the chance to play the StarCraft II multiplayer beta, which you can sign up for by logging into Battle.net and opting in. Like its predecessor, SC II presents three playable species, an innovative control interface, and Blizzard's signature quality in graphics and audio. The changes are more subtle than something like a new playable faction, but profound enough to have a major impact on the game.

Old games



As most of you know, I primarily work with the latest tech (or at least the latest I can work with on my little Radeon 9600), and immensely enjoy working on shaders, render-to-texture, and other such spiffy things. And, as a Mac user, I focus on OpenGL work, keep up with the latest OpenGL news (Longs Peak!), etc. So, what games do I like to play? With an immense interest in the latest and greatest in cross-platform rendering, do I eagerly play the latest games with the prettiest graphics?

Eh, no, not really. In fact, to be perfectly honest, I don't own a game made after, oh, 2003-ish.

Ya, it's the original Command and Conquer. Someone put it up on GGE, and I got out the old game discs to see if it still worked under Mac OS 10.4.8 (it was written back in the days of 7.5.3). I figured, what, I'd play a couple of levels just to see if it still works; awesome if it does, meh otherwise. Ya, just a couple of levels, sure. So now I'm about halfway through the GDI campaign...

I miss games like this. The learning curve for Command and Conquer is about 15 minutes. As far as RTSs go, it is dead simple. But it's still incredibly fun and engrossing.

Anyhow, I'm sure you're all more interested in what I'm doing with Torque than what random 12-year-old game I'm playing. Right now, I have three primary goals:

1) Get all of the various features in (I'm a feature creep, I need to not feature creep, but the features are so useful they border on/are necessary).

2) Make it cross-platform and consistent. Everything should work on Mac and Windows (maybe Linux, if there's anyone with a lot of experience with Linux, Torque, and OpenGL), and the code should follow the same general patterns. For example, at the moment, VBOs are tracked by a linked list, but shaders are tracked by a Vector. Bad me!

3) Document it until I want to jump off a bridge. Then document it until I do jump off a bridge.

History and Gaming




“Oh no! Jane has dysentery and I lost a wagon wheel!” Strange statement? Maybe, but not if you’re playing Oregon Trail, a game meant to teach students about the lives of pioneers in the 1800’s, once popular in elementary school classrooms. While Oregon Trail may be more popular on Facebook now than it is in the classroom, using games to teach history is still a serious subject for some scholars.
Case in point is Kelly McMichael, who presented History Gaming: The Texas Course Redesign Project at the poster session of the 122nd Annual Meeting. Those who stopped by her poster session booth were treated to a take-home CD that included two games, a teaching guide, and a lesson. The two games, "CSI: Fomenting Revolution, Philip Nolan, 1801" and "CSI: Roanoke," are both role-playing games, allowing students to navigate their own course through the game, clicking on certain elements to obtain more information. Even the included lesson plan (on the American Revolution) is interactive, containing video and interactive flash elements.

McMichael is the director of the Center for Educational Gaming in the Humanities (CEGH) at the University of North Texas. Visitors to the CEGH web site will find the “CSI: Roanoke” game and learn more about using games in the history classroom. The “Why Game? ” section of the site explains that games can engage students in “highly difficult tasks that are intrinsically motivating, require the acquisition of new knowledge and skills, and reinforce through a supportive environment and self-reflection.”
While the CEGH web site still contains a number of sections that are under construction, it does offer excellent resources like elaborate "EDUzines." These PDFs contain articles, interactive flash sections, links to online resources, and more. The site also offers an announcement seeking contributors (specifically faculty or graduate students living in Texas) for a grant project "to create course content for the U.S. History Survey Course, 1877 to the present." This announcement is also interesting because it provides a link to a previously created course that contains examples of interactive gaming, like "Articles of Confederation vs. the Constitution" (in the "7. Problems Facing the New Country" section of the course) and "Who Wants to be a Historian?" (in the "11. Summary" section of the course).

Indiana Jones and the History of His Videogames



The stories of an adventurous archaeologist with a hat and a whip. Indiana Jones is one of the most well-known characters in film history; the theme tune has worldwide renown, and the famous boulder scene is one of film's most iconic, parodied, acknowledged and recreated countless times.
A man who swings across chasms with a whip, shoots Nazis, explores temples and jumps between moving trucks; he’s a badass and his adventures transition perfectly to video games. Whether you’re pointing and clicking around a freshly excavated tomb, fighting skeletons with your trusty whip or asking “which collection of pixels is supposed to represent Indy”, Harrison Ford’s character has appeared in a colossal ten games, with another two on their way – that’s three times the number of films!
With the fourth film in the series, Indiana Jones and the Kingdom of the Crystal Skull, gearing up for the silver screen this week, we thought it was the perfect time to take a trip down memory lane. A lane filled with booby traps, Nazis and fine leather jackets…

A brief history of Gaming



So, welcome to my blog. I decided I'll begin by summarizing my gaming history to help guide my future articles. I'll be dedicating an entry apiece to what I loved and hated about each game. Occasionally I may post about the progress of the game I'm developing, since the whole purpose of dissecting other games is to help me improve mine.
So, in 2001 (not a space odyssey), I joined RuneScape. The game captivated me for years, and to this day remains the game with my longest /played. I levelled as a mage/miner and reached the top of the pile by a large margin; I was so far above the average level that in Castle Wars (when it was finally added) I could typically march into the opposing castle, kill whoever I wanted, and not die for entire games. I loved the small, close-knit community of the game, and relished exploring a world with virtually no information available about its workings. Fansites had not yet been made, the Knowledge Base didn't exist, and I had no experienced friends to tell me everything about it. But as the game grew, the unknown within it shrank, and eventually, I left for greener pastures.


This is me. Veteran RS-C players will remember that there was no penalty to magic for using melee armor and weapons, and that you could fight with both magic and a sword simultaneously. This has changed.
So, I joined MapleStory. The tutorial almost convinced me to quit and look elsewhere before I had even tried it properly, but once I left the tutorial and got out into noob island, I fell in love with the gameplay. I had come expecting more of the same, click to move, click to kill, autoattack anything to death. What I found was a Multiplayer/Coop Platformer with some cRPG elements on the side. Levelling quickly, I soon left the starter area and set out to become a Bowman (note: this is a bad choice, the other classes are much more effective). Anyone who has tried MapleStory can probably guess what led me to quit.
Next, I went to WoW. I was wary; the game had stolen away many of my online friends. I rolled a Human Warrior (Dimonorra, Area 52) on a random server. That night I played until the sun came up, so thrilled was I with the quality of the combat system and the graphics. Then I found out you couldn't switch servers whenever you wanted, which led me to reroll onto my friends' server. Triessa, Windrunner, was my first character to reach level 70, but I hit max level just before WotLK and thus never got to try the endgame, due to lack of interest from the rest of the population.
Since endgame WoW had been a bust, I wandered between Yohoho! Puzzle Pirates and Flyff for a few weeks. Flyff was awful, combining everything I hated about every game I'd ever played into one. Y!PP was a good deal more fun, but grew stale once I had tried everything. As "incredible" as I was at sailing, it simply wasn't fun.
So I drifted back to WoW a month late for the expansion. This was a blessing in disguise. Initially, I was annoyed that, yet again, my friends were all endgame raiding while I was stuck killing 30 boars. But really, I was lucky: I missed the mad scramble for mobs to kill that had followed the level cap increasing to 80. Still, I became fed up with the glacial (no pun intended) pace of levelling through Northrend, and rerolled a Death Knight (Desmentia, Windrunner). I blazed to level 80 in two weeks, gearing as a tank along the way. Since then I've raided all the current content, clearing Naxx, VoA (both bosses), EoE, OS+2, and most of Ulduar.

Gaming Technology "How"



Unless you don’t watch TV (in which case, why the heck are you surfing the Internet?), you’ve probably seen all the commercials out there for Afro Samurai. Even if you aren’t really paying attention, Samuel L. Jackson’s infamous bad-ass voice is likely to pique your interest at least a little.
This game is available on the PlayStation 3 and Xbox 360 platforms, and so far the reviews seem to be good; people are giving it an average score of 8 to 8.1 out of 10. I have to confess, I haven't played it, and I have no intention of doing so (well, probably; who knows). I'm just generally not into the whole samurai bit. Why write about this game, then? A question, really.
Most big-name stars (A-list, B-list, mostly) have at least one video game voiceover on their IMDb credits list. Some of them are understandable. I mean, I bought the LOTR games just because Elijah Wood's voice is in them, omg! But is it possible, even just a little, that some of these games have big celebrity voiceovers just as a way to fool some of us gamers into thinking it has to be an awesome game, even if it's not? Discuss pls. Kthxbye.

Sunday, June 28, 2009

We Love The 90s for PC Games


PC gaming has been around since, well, PCs have existed. However, the platform always seemed to get the short end of the attention stick compared to console game titles (with a few exceptions). That all changed in the 1990s as PC gaming entered a new golden age. With developments like shareware titles, CD-ROM games, Internet-based titles, Windows 95 and more powerful PCs, quite a few titles got a ton of attention and sales.

In this new multi-part feature we look back at that glorious decade of the 1990s as we focus on just some of the games that we loved to play, and in some cases still play, from that decade. Today we look back at a few of our favorite titles from 1990 and 1991.

The Secret of Monkey Island
MONKEYS! MONKEYS! MONKEYS! Sorry, had to be done. Yes, the LucasArts adventure game factory was in full swing in 1990 when it released this beloved game. Developed by Ron Gilbert, the game had a lead character with one of the best names ever: Guybrush Threepwood. You can't go wrong with that kind of name. With lots of humor, pirates and terrifically fun gameplay, The Secret of Monkey Island was a big success for LucasArts and spawned three sequels. However, the lack of interest in adventure games in the new millennium caused the series to end with its fourth installment in 2000.

Video Game Design Between 1990-2008


I remember the good old days, when my friends and I were playing Doom, Mortal Kombat, Quake or Warcraft on a 133 MHz Pentium with a Sound Blaster and a 4MB video card. A lot has changed since then.
Video games are actually a lot older than the ones we played in the 90s. The earliest video-game-like device, the Cathode-Ray Tube Amusement Device, was patented in 1947, and 1958's "Tennis for Two" was played on an oscilloscope-like display.
With the release of the Apple II, Commodore 64 and ZX Spectrum in the late 70s and early 80s, people could afford such machines and play video games in their own homes. By the mid-80s the video game industry was evolving at a fast pace, releasing games such as Zork, Battlezone and The Bard's Tale.

How games are made


Abandonia is an abandonware website, focusing mainly on showcasing computer games and distributing games made for the MS-DOS system.
Abandonia also features a music section and an Abandonware List, a continuously expanded database of over 4,600 games, including information about their publishers, release dates, and whether, to the staff's knowledge, the software is still sold, protected or abandoned. The list is the sum of research and enquiries made by the site crew, with sources including, but not limited to, MobyGames, Wikipedia and the company registry at Home of the Underdogs.
Abandonia Reloaded, sometimes referred to as "AR" or simply "Reloaded," is a sister project of Abandonia, with a focus on freeware games.
Every game showcased is accompanied by a set of screenshots, and reviews written and proof-read by members. As non-profit sites, both Abandonia and Abandonia Reloaded are community-driven projects. With the exception of the featured games themselves, all content available on the sites is created by their community as a volunteer effort.
Both also feature a progressive system of game evaluation, in which the quality of the game is rated not just by the reviewer but also by votes of regular visitors.

History of Games....


In 1952, A.S. Douglas wrote his PhD thesis at the University of Cambridge on human-computer interaction. As part of it, Douglas created the first graphical computer game, a version of Tic-Tac-Toe. The game was programmed on the EDSAC, a vacuum-tube computer with a cathode-ray-tube display.
William Higinbotham created the first video game in 1958. His game, called "Tennis for Two," was created and played on an oscilloscope at Brookhaven National Laboratory. In 1962, Steve Russell invented Spacewar!, the first game intended for computer use. Russell designed his game on an MIT PDP-1 mainframe computer.
In 1967, Ralph Baer created the first video game played on a television set, a game called Chase. Baer was then part of Sanders Associates, a military electronics firm.

Saturday, June 27, 2009

What Are Games??

A video game is an electronic game that involves interaction with a user interface to generate visual feedback on a video device. The word "video" in "video game" traditionally referred to a raster display device; with the popular use of the term, however, it now implies any type of display device. The electronic systems used to play video games are known as platforms; examples of these are personal computers and video game consoles. These platforms range from large computers to small handheld devices. Specialized video games such as arcade games, while previously common, have gradually declined in use.

The input device used to manipulate video games is called a game controller, and it varies across platforms. For example, a dedicated console controller might consist of only a button and a joystick, while another may feature a dozen buttons and one or more joysticks. Early personal computer games often needed a keyboard for gameplay, or more commonly, required the user to buy a separate joystick with at least one button. Many modern computer games allow, or even require, the player to use a keyboard and mouse simultaneously.

Video games typically also use other ways of providing interaction and information to the player. Audio is almost universal, delivered through sound reproduction devices such as speakers and headphones, and other feedback may come via haptic peripherals, such as vibration force feedback.