TotalVidMem?

How do I convert those bytes to megabytes? Do I divide or multiply, and by what number?

Global bytes
Global kilobytes
Global megabytes
Global gigabytes
Global terabytes
bytes = 1024
kilobytes = (bytes*1024)
megabytes = (kilobytes*1024)
gigabytes = (megabytes*1024)
terabytes = (gigabytes*1024)
I'm pretty sure that's the math involved. Please correct me if I'm wrong, math nerds :P
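
For reference, each of those units is 1024 of the one below it (1 KB = 1024 bytes, 1 MB = 1024 KB, and so on), so turning a raw byte count into a larger unit means dividing by 1024 at each step rather than multiplying. A quick worked example with an arbitrary figure: 268435456 bytes / 1024 = 262144 KB, and 262144 KB / 1024 = 256 MB.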

Print TotalVidMem()*1024
Is that total in megabytes?

Try /1000. I'm not a mathematician. I vaguely remember how that formula goes.

Ok, here: TotalVidMem()/1024/1024
Bytes to KB to MB.
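
A minimal sketch of that conversion in Blitz3D, assuming TotalVidMem() and AvailVidMem() report a byte count (as this thread treats them) and that a graphics mode has been set first; the window size and mode here are arbitrary:

Graphics3D 800, 600, 32, 2      ; arbitrary windowed 3D mode so the driver is initialised

; bytes / 1024 = KB, then / 1024 again = MB (integer division)
totalMB = TotalVidMem() / 1024 / 1024
availMB = AvailVidMem() / 1024 / 1024

Print "Total video memory: " + totalMB + " MB"
Print "Available         : " + availMB + " MB"
Print "In use            : " + (totalMB - availMB) + " MB"

WaitKey()
End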

I have a problem: every time I change the screen resolution, the value of TotalVidMem()/1024/1024 - AvailVidMem()/1024/1024 goes up and up and up. That can't be right; the graphics card only has 1024 MB, yet the number keeps climbing. I think this is returning a wrong value.

It might be an issue with the actual value of TotalVidMem() being larger than what's possible to represent with 32-bit numbers.
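
A quick way to test that idea, sketched on the assumption that the byte count comes back in a signed 32-bit integer (Blitz3D's integer type), so anything above 2147483647 bytes, roughly 2 GB, wraps around to a negative number; the graphics mode is again arbitrary:

Graphics3D 800, 600, 32, 2

vram = TotalVidMem()                 ; byte count held in a signed 32-bit int

If vram < 0
	; more than ~2 GB of video memory has wrapped the 32-bit value
	Print "TotalVidMem() overflowed; the reported value is unreliable"
Else
	Print "Total video memory: " + (vram / 1024 / 1024) + " MB"
EndIf

WaitKey()
End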