andrewdonshik Posted October 22, 2012
GeForce 210, howzzat?
jakj Posted October 22, 2012 (Author)
GeForce 210, howzzat?
Not bad, solidly in the GL 3 / DX 10 range. Not top-of-the-line, but good enough for another 2-5 years.
andrewdonshik Posted October 22, 2012
Not bad, solidly in the GL 3 / DX 10 range. Not top-of-the-line, but good enough for another 2-5 years.
OK, good. It was the only 30.00 one that worked with OS X.
Kruziik_Kel Posted October 23, 2012
I'm currently stuck with an 82945G chipset, but soon (provided the 8970 doesn't come out before then) I'll have a nice shiny Sapphire 7970 3GB GHz Edition with the Vapor-X cooler.
GreenWolf13 Posted October 23, 2012
I do not have a card, I have Intel Integrated Express Chipset 82945G.
I thought you had a Cool Ranch Dorito for a graphics card. Not sure what graphics card I have; pretty sure it's an integrated graphics card, but I'll have to check.
Hushful Posted October 23, 2012
I thought you had a Cool Ranch Dorito for a graphics card. Not sure what graphics card I have; pretty sure it's an integrated graphics card, but I'll have to check.
Cool Ranch Dorito was the development name for that integrated chipset.
BurningCake Posted October 23, 2012
^Was it really? >_> Anyway, I've got an nVidia GeForce (EVGA) 550 Ti card at stock clocks. I get a good 200 FPS in vanilla, 100-something FPS with Technic/Tekkit.
GreenWolf13 Posted October 23, 2012
^Was it really? >_> Anyway, I've got an nVidia GeForce (EVGA) 550 Ti card at stock clocks. I get a good 200 FPS in vanilla, 100-something FPS with Technic/Tekkit.
With Optifine installed, everything turned to the lowest settings, and render distance at Short, I get about 30 FPS in vanilla, 40 when I set render distance to Tiny. In Tekkit I get the same FPS as I do in vanilla. It's an improvement over the 3-4 FPS I would get on Tiny render distance without Optifine in vanilla.
jakj Posted October 23, 2012 (Author)
Without Optifine, your graphics card's effect on your framerate is precisely dick. Minecraft's base code is horrible: it tears down and re-establishes GL state for every single block, item, and entity, during every single frame, in -addition- to not even using VBOs to reduce bus-bandwidth consumption. That makes your CPU power, your RAM's speed and frequency, and your system bus's width and frequency the most important things.

(Some rudimentary VBO code is actually in Minecraft, but it's conditionally linked to an always-false variable, so in effect it's commented out. Not that it would do much good anyway, because the VBO implementation is just a round-robin orphaning scheme that still involves the same amount of traffic across the system bus and the same amount of state alteration. Part of the problem is that a lot of the code, when specifying vertices, actually recalculates the vertices every frame instead of using the transformation matrix, so there's very little the GPU could cache in the first place.)

Just imagine a glorious world with a rewritten graphics engine using VBOs: so little space would even be required! I'm fairly certain you could fit every single model in vanilla Minecraft into a single index/vertex VBO pair no bigger than 1-2 MB, and the only thing going over to the graphics card per object would be a single transformation matrix of 16 floats (64 bytes), as opposed to the minimum of 232 bytes being transferred now: 8 vertices at 3 floats each and 4 bytes per float (96 bytes), plus 36 short indices at 2 bytes each (72 bytes), plus the 16-float matrix at 4 bytes per float (64 bytes). And that's just for one single cube!

Optifine really is a work of art.
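To make the comparison concrete, here's a minimal LWJGL sketch of the VBO approach described above. This is not Minecraft's actual code: the class and field names are hypothetical, and the cube data is assumed to be the 8-vertex/36-index layout from the byte math above. The geometry crosses the bus once at startup; each draw afterwards sends only the 64-byte matrix.

```java
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;
import org.lwjgl.BufferUtils;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL15.*;

public class CubeVbo {
    private int vertexVbo;
    private int indexVbo;

    // Once, at startup: upload 8 vertices (96 bytes) and 36 indices (72 bytes).
    public void init(float[] vertices, short[] indices) {
        FloatBuffer vb = BufferUtils.createFloatBuffer(vertices.length);
        vb.put(vertices).flip();
        ShortBuffer ib = BufferUtils.createShortBuffer(indices.length);
        ib.put(indices).flip();

        vertexVbo = glGenBuffers();
        glBindBuffer(GL_ARRAY_BUFFER, vertexVbo);
        glBufferData(GL_ARRAY_BUFFER, vb, GL_STATIC_DRAW); // geometry crosses the bus once

        indexVbo = glGenBuffers();
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexVbo);
        glBufferData(GL_ELEMENT_ARRAY_BUFFER, ib, GL_STATIC_DRAW);
    }

    // Every frame, per cube: only a 16-float (64-byte) transformation matrix.
    public void draw(FloatBuffer modelMatrix) {
        glPushMatrix();
        glMultMatrix(modelMatrix); // 64 bytes, versus re-sending all the vertices

        glBindBuffer(GL_ARRAY_BUFFER, vertexVbo);
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexVbo);
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, 0L); // read positions straight from the bound VBO
        glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_SHORT, 0L);
        glDisableClientState(GL_VERTEX_ARRAY);

        glPopMatrix();
    }
}
```

Contrast that with immediate mode, where every cube costs a fresh run of glBegin/glVertex/glEnd calls (plus the surrounding state churn) every single frame, all of which has to travel over the system bus again.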
andrewdonshik Posted October 23, 2012
Optifine really is a work of art.
Yes, yes it is. That would be why MC is all CPU.
GreenWolf13 Posted October 23, 2012
Ah, that explains why I get a good frame rate in games like TF2 or Portal; my graphics card must be better than I thought. My computer isn't optimized for gaming, so the CPU isn't amazingly powerful, but it's powerful enough to run games like TF2 and other high-end games. The more I listen to you, jakj, the more I realize just how bad a coder Notch is. Also, according to you, 3 + dick = 3.
andrewdonshik Posted October 23, 2012
Ah, that explains why I get a good frame rate in games like TF2 or Portal; my graphics card must be better than I thought. My computer isn't optimized for gaming, so the CPU isn't amazingly powerful, but it's powerful enough to run games like TF2 and other high-end games. The more I listen to you, jakj, the more I realize just how bad a coder Notch is. Also, according to you, 3 + dick = 3.
Yeah. Notch had ideas, but no coding skills.
GreenWolf13 Posted October 24, 2012
Yeah. Notch had ideas, but no coding skills.
Not even ideas, really. Minecraft is not the first sandbox game, and it is definitely not gonna be the last. And have you noticed that some of the coolest things have been implemented by Jeb? Pistons, dragons, everything after 1.0. Notch came up with a great way to combine parts of different games; Jeb made it amazing.
andrewdonshik Posted October 24, 2012
Not even ideas, really. Minecraft is not the first sandbox game, and it is definitely not gonna be the last. And have you noticed that some of the coolest things have been implemented by Jeb? Pistons, dragons, everything after 1.0. Notch came up with a great way to combine parts of different games; Jeb made it amazing.
True. Sorry, I thought Jeb only took over after 1.0. Notch isn't the idol he's made out to be.
GreenWolf13 Posted October 24, 2012
True. Sorry, I thought Jeb only took over after 1.0. Notch isn't the idol he's made out to be.
Jeb did take over after 1.0, but he was working on Minecraft for a long time before that. Most of the Adventure Update was done by Jeb, as well as quite a bit of 1.7.
jakj Posted October 24, 2012 (Author)
Pistons
Already done by a mod before being added to vanilla.
dragons
You call that an original concept creditable to one person?
everything after 1.0
Like redstone repeaters (done by a mod already)? Temples in world generation (done by a mod already)? Anvils for repair (done by a mod already)? Am I getting closer?
GreenWolf13 Posted October 24, 2012
Already done by a mod before being added to vanilla. You call that an original concept creditable to one person? Like redstone repeaters (done by a mod already)? Temples in world generation (done by a mod already)? Anvils for repair (done by a mod already)? Am I getting closer?
I'm not saying Jeb came up with those; I'm saying he implemented them. And redstone repeaters have been in vanilla for ages.
Jay? Posted October 24, 2012
GTX 550 Ti. Currently debating whether I should get a second one of those and go dual SLI.
jakj Posted October 24, 2012 (Author)
GTX 550 Ti. Currently debating whether I should get a second one of those and go dual SLI.
Considering that a lot of games don't even handle multi-core CPUs well yet, SLI is a mug's game unless you have a lot of money to burn, or you know there is at least one game that needs it, uses it well, and that you intend to play for a long time to come. If that isn't the case, a smarter move (in my opinion) would be to get a newer card when it comes down a bit more in price, like a 570 or 580, and keep the one you have as a backup so you're not card-less if you have a burnout. Later on down the line, once games are utilizing SLI better, you can get another 570/580 for even cheaper and do 2x or even 3x SLI with them for less money than a new card of the next line.
freakachu Posted October 24, 2012
Multi-GPU is a lot different from multi-CPU in terms of the programming required to make it work. Most of the work is done in the driver, which splits the load between the cards and has each one render every other frame; in many cases the program doesn't even have to know it's happening.
jakj Posted October 24, 2012 (Author)
Really? Interesting. I guess it just keeps the VRAM synchronized between them instead of pooling them.
freakachu Posted October 24, 2012
Really? Interesting. I guess it just keeps the VRAM synchronized between them instead of pooling them.
I believe it does that as well, to some extent. I'm not super informed on the nitty-gritty of it, but that is my understanding. I'm sure I'm oversimplifying it by quite a bit.
andrewdonshik Posted October 24, 2012
Just curious: is the ATI Radeon X1600 better than my current GeForce?
jakj Posted October 24, 2012 (Author)
Just curious: is the ATI Radeon X1600 better than my current GeForce?
Looks like it's a lot worse than the GeForce, actually, if you're referring to the one you posted about here previously.
andrewdonshik Posted October 24, 2012
Looks like it's a lot worse than the GeForce, actually, if you're referring to the one you posted about here previously.
Yep, that's what I was referring to. That's good, because it means I wasn't regressing to get OS X working.