I am a computer science student whose work includes writing computer programs, and my field of interest is computer graphics, with emphasis on real-time applications such as computer games. In these applications there is a 30 frames-per-second boundary that must be maintained, so that an image is shown on the display device at least every 1/30 s. I know that today's computers are already very fast at processing graphics, thanks to the development of GPUs, but I am *always* worried whether an algorithm I implement will hurt the frame rate, even if it just computes the length of a vector using the Pythagorean theorem, which says the magnitude of a vector is the square root of the sum of its squared components. A very cheap computation, yet it still makes me hesitant to implement it.
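For illustration, here is a minimal sketch of that magnitude computation in C++ (the Vec3 struct and length function are just my own example names, not taken from any particular engine or library):

    #include <cmath>

    struct Vec3 {
        float x, y, z;
    };

    // Magnitude via the Pythagorean theorem: sqrt(x^2 + y^2 + z^2).
    float length(const Vec3& v) {
        return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    }

A single square root like this is a tiny cost next to a 1/30 s frame budget; it only starts to matter if it is called a very large number of times per frame.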
You can write comments in any language that you want, but please bear in mind that I only understand 4 languages: English, Indonesian, Javanese and Malay.