Technical art is the discipline that computer-graphics companies rely on to produce things like video games, animated films, AR/VR experiences, and VFX for live-action movies.
Technical artists are also responsible for bringing some much-needed realism to the picture. They balance an artist’s creative freedom against what is realistic and functional in a game or movie. If, for example, an artist creates a nefarious villain with super-spiky shoulder pads, then the TA’s job includes figuring out whether those spikes might take the villain’s head off when he raises his arms. Just because they look really cool doesn’t mean they’re a realistic option. If there’s a danger of imminent decapitation, that’s No Bueno, and the TA will have to request a new design.
By hiring technical artists, a studio can divide the workload of developing and maintaining a game’s pipelines and tools between TAs and programmers. This, in turn, allows the programmers to concentrate on what they do best — develop game code. TAs collaborate with artists and programmers to make the game development process more streamlined, and that results in a win-win situation for everyone.
There seem to be three common breeds of technical artists, and honestly, some people will argue with me. That’s fine. It’s complicated. I know. First, there are technical art generalists on the Tools/Engine team. Second, there are character technical artists on the character & animation team, working on complex character and set-piece rigging. Third, but probably not last, there are design technical artists on the Physics & Prop teams, working on dynamic props and asset setup for simpler mechanical rigging. Then there are folks who get super niche and focus on VFX, destruction, environment, lighting, and other stuff, but I could really get off track here if we try to get too specific on all the varieties.
Regardless of what variation of technical artist they are, they tend to focus on one or more of the following disciplines:
Rigging is a technique that represents a 3D model with a hierarchy of linked digital bones. This bone structure is then used to handle the 3D model pretty much like a puppet. Rigging can be used to bring anything to life, such as animals, spaceships, etc. The point is that adding bones allows the model to be animated freely and with ease. And, for some folks, rigging makes it easier to map the movement of models, or even their destruction. For more on rigging, check out the links below.
- Rigging Dojo has an awesome blog with Technical Artists and Animators sharing their insight on preparing for an interview that focuses on rigging, animation, Mel, and python skills: https://www.riggingdojo.com/category/interview/
- Tech-Artists.org also has a lot of open-source channels to talk about different problems: http://discourse.techart.online/
- Lester Banks is a great website for anyone looking for info aimed at 3D artists, motion designers, etc. And they also happen to have awesome interviews from the Game Developers Conference, which you can find here.
- Want to learn about Vertex Skinning? Check out this article
- For other general insights on things like gimbal lock, interpolation, and inverse kinematics, plus rigging characters, vehicles, etc., and binding vertices to them (this can involve bones, constraints, controllers, expressions, cloth, skin, and so on), check out Polycount’s rigging wiki: http://wiki.polycount.com/wiki/Category:Rigging
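To make the “digital bones” idea above concrete, here’s a minimal sketch of forward kinematics in plain Python: each bone’s world position is found by walking up its chain of parents and accumulating rotations. Everything here (the 2D simplification, the bone names) is illustrative, not from any real rig or DCC tool.

```python
# A hierarchy of "bones": each bone's position depends on its parent's
# position and accumulated rotation (forward kinematics), exactly like
# posing a puppet. 2D and hypothetical names, purely for illustration.
import math

class Bone:
    def __init__(self, name, length, parent=None):
        self.name = name
        self.length = length      # distance from this joint to the next
        self.parent = parent
        self.angle = 0.0          # local rotation in radians

    def world_position(self):
        """Walk up the chain, accumulating rotations and offsets.
        Returns (x, y, accumulated_angle)."""
        if self.parent is None:
            return (0.0, 0.0, 0.0)  # root sits at the origin
        px, py, pa = self.parent.world_position()
        a = pa + self.parent.angle
        return (px + self.parent.length * math.cos(a),
                py + self.parent.length * math.sin(a),
                a)

# A tiny two-bone "arm": rotate the shoulder and the hand follows.
shoulder = Bone("shoulder", length=1.0)
elbow = Bone("elbow", length=1.0, parent=shoulder)
hand = Bone("hand", length=0.0, parent=elbow)

shoulder.angle = math.radians(90)   # raise the whole arm
x, y, _ = hand.world_position()
print(round(x, 3), round(y, 3))     # → 0.0 2.0 (arm points straight up)
```

Real rigs do the same accumulation in 3D with full transform matrices, plus skinning weights that bind mesh vertices to these bones.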
Everyone has heard of shaders, or shader-based engines or software, but most people don’t know what they are, or what the craze is. The most succinct explanation I can think of for a shader is: A shader takes something, does stuff to it, and gives you something else.
Depending on who you are, that is either the most mundane explanation, or the most intriguing explanation. Obviously, programmers find it intriguing. And I hope you will too.
Shaders come in two main sorts: vertex shaders and pixel shaders. They go hand in hand, but pixel shaders are where the magic happens.
A shader is a bit of computer code that is commonly used to define how a surface will be rendered. It takes some inputs (textures, vertices, view angles, etc.), makes some changes to them, then tells the game renderer how to draw the result. Shaders are typically used for interactive rendering, as in a 3D game, where the view is rendered in real time at 30 fps (or better).
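That “inputs in, color out” idea can be sketched in a few lines. Real pixel shaders are written in GPU languages like HLSL or GLSL and run once per pixel, but the logic below (a simple Lambert diffuse term, with made-up helper names) shows the shape of what one does:

```python
# A toy "pixel shader": takes a surface normal, a light direction, and a
# material color, and returns the lit color for one pixel. This is basic
# Lambert diffuse lighting; the function names are illustrative.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def pixel_shader(normal, light_dir, base_color):
    """Inputs in, color out: the whole shader idea in one function."""
    n = normalize(normal)
    l = normalize(light_dir)
    intensity = max(0.0, dot(n, l))   # facing away from the light = black
    return tuple(c * intensity for c in base_color)

# Surface facing straight up, light directly overhead: full brightness.
print(pixel_shader((0, 1, 0), (0, 1, 0), (1.0, 0.5, 0.2)))
# Light at 90 degrees to the surface: intensity 0, pixel goes black.
print(pixel_shader((0, 1, 0), (1, 0, 0), (1.0, 0.5, 0.2)))
```

The GPU runs something like this millions of times per frame, which is why shader efficiency matters so much for real-time work.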
“Real-time” shaders are optimized to render efficiently, trading accuracy for performance. Conversely, “offline” shaders are used with non-real-time renderers, like V-Ray or mental ray, and support more accurate effects like ray tracing and sub-pixel filtering. But they are non-interactive (they can take several minutes to render each frame). I found some really nifty links that can be super helpful from the Polycount wiki:
- 3ds Max Shaders
  - 3ds Max viewport shader by Laurens “Xoliul” Corijn. More info in the Polycount thread Xoliul’s 3DS Max Viewport Shader.
  - 3Point Shader by 3 Point Studios. Shader with many options, including near-perfect results for 3ds Max generated normal maps. See the Polycount thread 3Point Shader Lite — Shader material editor and Quality Mode normalmaps for 3ds Max.
  - Agusturinn Shader Demo by Wang “RTshaders” Jing.
- Maya Shaders
  - BRDF shader for Maya by Brice Vandemoortele and Cedric Caillaud. More info in the Polycount thread Free Maya/max cgfx/fx Shader. Update: New version here with many updates, including object-space normal maps, relief mapping, self-shadowing, etc.
  - DOTA2 Hero Shader — 3DS & Maya Shader Material by Luigi ‘Ace-Angel’ Kavijian and Drew ‘Drew++’ Watts. Half-Lambert diffuse term with controllable falloff, additive rim lighting, Phong specular, mask support (Dota2 style), all the fancy effects Valve uses!
  - Kodde Shader — has several features such as normal mapping, specular, gloss, reflections, ambient cube, parallax, etc. More info in the Polycount thread “KoddeShader”, a Maya CGFX shader. Update: Version 2.0 here including updates such as blended normals for skin, cube map mip level parameters for blurring reflections and ambient light, 2-pass transparency support, etc.
For beginner, intermediate, and advanced tutorials, go to Polycount’s wiki on shaders: http://wiki.polycount.com/wiki/Shaders
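One of the shaders listed above mentions a “Half-Lambert” diffuse term, which is worth a quick aside: it’s Valve’s published trick for keeping the unlit side of a model from going fully black. Standard Lambert clamps N·L at zero; Half-Lambert remaps N·L from [-1, 1] into [0, 1] and squares it. The helper names below are illustrative:

```python
# Standard Lambert vs. Valve's Half-Lambert diffuse term, compared on a
# single N.L value (the dot product of surface normal and light direction).

def lambert(n_dot_l):
    # Anything facing away from the light is clamped to pure black.
    return max(0.0, n_dot_l)

def half_lambert(n_dot_l):
    # Remap [-1, 1] -> [0, 1], then square to restore falloff shape.
    wrapped = n_dot_l * 0.5 + 0.5
    return wrapped * wrapped

# A surface angled well away from the light (N.L = -0.5):
print(lambert(-0.5))       # → 0.0    (pure black)
print(half_lambert(-0.5))  # → 0.0625 (still faintly lit)
```

The result is softer, more readable characters, which is exactly why stylized games like Dota 2 lean on it.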
The Art Pipeline — GPU/Rendering/Art Architecture:
Pipeline programming — The art pipeline is basically the process of building a video game from conception to completion. The process includes four stages: concept, pre-production, production, and post-production. The pipeline helps organize the flow of work (think Trello, but in a big way) so that everyone is aware of the work they need to turn in and their deadlines. The pipeline helps manage the budget and timeline, and reduces bottlenecks and inefficiencies.
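A concrete (and entirely hypothetical) taste of what day-to-day pipeline tooling looks like: small gatekeeper scripts like the asset validator below, which checks files before they’re allowed into the build. The rules, extensions, and budget here are invented for illustration; a real studio’s checks live in its own pipeline code.

```python
# A tiny asset validator: the kind of pipeline tool a TA writes so bad
# assets get caught at check-in instead of mid-production. All rules
# and limits below are made up for illustration.

MAX_TEXTURE_BYTES = 4 * 1024 * 1024   # hypothetical per-texture budget

def validate_asset(name, size_bytes):
    """Return a list of problems; an empty list means the asset passes."""
    problems = []
    ext = name.rsplit(".", 1)[-1].lower() if "." in name else ""
    if ext in {"png", "tga"}:
        if size_bytes > MAX_TEXTURE_BYTES:
            problems.append(f"{name}: texture over budget")
    elif ext not in {"fbx", "obj"}:
        problems.append(f"{name}: unsupported file type")
    return problems

print(validate_asset("hero_diffuse.png", 8_000_000))  # flags the oversized texture
print(validate_asset("hero.fbx", 1_000_000))          # passes: []
```

Multiply this by naming conventions, polygon budgets, export settings, and engine import hooks, and you have a picture of why pipeline mistakes made early get felt months later.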
Generally speaking, the technical artist should be able to design and develop all art pipelines necessary for the game. In this sense, part of the technical artist’s role is to be a pipeline and systems architect. Every gaming studio works slightly differently. You’ll hear some studios talk about how their technical artists are linchpins and SUPER valuable.
Depending on the system, they work with the programming and art departments to determine what works best for both parties and try to reach common ground. At this level, you could say they act as negotiators between the technical and content-oriented disciplines. They get to help art and engineering decide how cool the game’s decapitation graphics can be, given console/PC/mobile constraints mixed with the engineers’ ability to scale GPUs for image rendering.
However, designing critical game systems requires technical artists who have intimate knowledge of both the game engine and development hardware, such as the Xbox 360 or PlayStation 3. The degree of knowledge necessary is such that if the technical artist isn’t experienced enough or hasn’t made due diligence a priority, pipelines can quickly take a turn for the worse. More often than not, early mistakes are felt in the middle of production, or even later in the development cycle.
Not only do technical artists design and spec-out these systems (in coordination with other disciplines), they are the ones driving and championing the changes. Because of their intimacy with such systems, it makes them the primary source of information when it comes to how things work and fixing bugs in tools written to support the pipeline.
It is important to note that in most studios technical artists do not design code structure for the programmers. This level of granularity is neither their job nor their area of expertise. Instead, they work with programmers at a higher level to develop the best way to get requested features from Point A (content creation) to Point B (the game) and everything in between.
Simon Schreibt explained the rendering pipeline with artist-friendly terms in a blog article: Simon Schreibt: Render Hell
Games aren’t the same if we cannot get mind-blowing explosions, dope fireballs, or awesome zombie deaths with sweet sweet K.O.s. So, we need technical artists that also understand how to use particles, decals, glows, blooms, and other things to create explosions, clouds, skies, water, and other visual effects that make those awesome moments happen.
Glow often refers to the visual effect of air being illuminated near a light source. This could be the glow of a fireball being shot out of a wizard’s staff. It could be sun rays bursting through fog, mist, or dust in a dark scene. Or it could be a flashlight on a foggy night. In a real-time 3D game, glows are faked using a variety of techniques. The term “glow map” usually means an emissive map instead. Emissive does not illuminate the air, and it does not change the silhouette of a model.
- A Game Art Trick: Doom 3 — Volumetric Glow by Simon “SimonT” Schreibt
- UDK Volumetric Light Beam Tutorial by Epic Games
- Using Glow in Unreal Engine 4: https://youtu.be/Q45HW6hsJzQ
- Unity Tutorial: https://oxmond.com/glowing-orb-visual-effects-vfx/
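The emissive-map distinction above can be sketched in a couple of lines: an emissive map is simply added to a pixel’s color after regular lighting, so a surface can look self-lit without changing anything around it. The values and function name below are illustrative single-pixel samples, not a real renderer.

```python
# Why an emissive ("glow") map doesn't light the air: it is added to the
# pixel's own lit color at the end of shading. Neighboring pixels are
# untouched; halos come from separate bloom/volumetric tricks.

def shade_pixel(lit_color, emissive_color):
    """Final color = lighting result + emissive contribution, clamped to 1."""
    return tuple(min(1.0, l + e) for l, e in zip(lit_color, emissive_color))

# A pixel in total shadow (no incoming light) with a bright orange
# emissive map still reads as glowing:
print(shade_pixel((0.0, 0.0, 0.0), (1.0, 0.4, 0.0)))  # → (1.0, 0.4, 0.0)
```

This is also why emissive never changes a model’s silhouette: it only affects pixels the model already covers.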
Pixel art is a form of digital art, created through the use of software, where images are edited at the pixel level. It is created at extremely low resolutions, so individual pixels are clearly visible, and usually with palettized color (256 colors or less). The aesthetic for this kind of graphics comes from 8-bit and 16-bit computers and video game consoles, in addition to other limited systems such as graphing calculators.
- So You Want To Be A Pixel Artist?
- Extreme palette cycling examples
- Blowing S#!t Up the Bungie Way (color cycling for muzzle flashes in Halo 3)
- GraphicsGale pixel art editor ($22)
- Pro Motion pixel art editor ($78)
- Pixel Joint — great pixel art gallery
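“Palettized color,” mentioned above, means every pixel must be one of a small fixed set of colors, so drawing (or converting an image) comes down to snapping each RGB value to its nearest palette entry. The tiny 4-color palette below is made up for illustration:

```python
# Nearest-palette-entry quantization: the core of palettized color.
# A real console palette would have up to 256 entries; 4 is enough to
# show the idea.

PALETTE = [(0, 0, 0), (255, 255, 255), (255, 0, 0), (0, 0, 255)]

def nearest_palette_color(rgb):
    """Pick the palette entry with the smallest squared RGB distance."""
    return min(PALETTE,
               key=lambda p: sum((a - b) ** 2 for a, b in zip(p, rgb)))

print(nearest_palette_color((200, 30, 20)))   # → (255, 0, 0): snaps to red
print(nearest_palette_color((40, 40, 60)))    # → (0, 0, 0): snaps to black
```

Tools like GraphicsGale and Pro Motion do this (plus dithering and hand-built palettes) as their bread and butter.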
There are a ton of engines out there in the world, but some major ones are used most commonly in the industry. So, I’m going to help by giving you a resource to three of them:
Now, there are a few other major areas we could go over, but as much as I love that you’re investing your beautiful and valued time in reading my article, I don’t want you to feel like this is a dissertation. We can save room in a future article to dive into other areas technical artists touch, such as:
- Composition — is all about positive and negative space and how your characters interact with the background. It can either make or break your game, because composition and layout are the first things your users will experience. It requires a careful balance for your character to have enough space to move around comfortably, without appearing either caged or dominated by the background.
- Lighting — is an essential part of creating realistic scenes. It deals with the source(s) of light that bounce off the objects in your field of view. In gaming, lighting effects can change the whole perception of a scene. For example, a night-time scene will require accurate lighting, just enough to create a suspenseful backdrop. Poorly executed lighting effects can exasperate players (remember the Battle of Winterfell in season 8 of GoT?).
- Art-specific tools — A technical artist’s arsenal can include pixel art tools, proficiency in Maya, etc. Some game engines let creators make games out of the box, and customizations can let them do things they couldn’t do before. Most modern game engines support add-ons and plug-ins that are either developed in-house or installed from third parties. These tools can go a long way in transforming how a game engine works.