News/Info: NVIDIA 3000 series and more technical info, including protocols, software, etc.

emailx45

TDP, TBP or TGP by NVIDIA: for NVIDIA they amount to the same thing = the GPU Boost power limit = no more than the values below!
  • Power consumption is a somewhat vague term, because NVIDIA uses acronyms like TDP (widely used in the graphics-card world) and TGP to talk about the power the graphics card requires.
    • TDP: Thermal Design Power (or Thermal Design Point or Thermal Design Parameter)
    • TGP: Total Graphics Power
    • TBP: Total Board Power (or Typical Board Power, used by AMD)
    • GCP: Graphics Card Power
    • MPC: Max Power Consumption
  • So these values refer to the maximum in-game draw, calculated at the GPU's maximum boost clock. Never beyond this!
    • One observation for the 20xx series (only): the USB-C port is not included in this value! So you have to add card TDP + USB power draw = total power consumption!
    • In other words, TDP and TGP are not the same thing in that series!
    • Approximately 20 W goes to one USB port, and it must be added to the card's TDP to obtain the final consumption figure.
  • GeForce 3070 = 220 W
  • GeForce 3080 = 320 W
  • GeForce 3090 = 350 W
  • Two 8-pin power connectors provide ~300 W (with an eye toward overclocking); at minimum, one 8-pin plus one 6-pin connector could be used as the power feed (see the power-budget sketch below).
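A minimal power-budget sketch of the arithmetic above. The connector limits (75 W from the slot, 150 W per 8-pin, 75 W per 6-pin) are the usual PCI-SIG figures, and the 215 W TDP used for the 20-series example is only an illustration; the 320 W and ~20 W numbers come from the post itself.

```python
# Rough power-budget sketch (illustrative numbers; check your card's spec sheet).
SLOT_W  = 75    # PCIe x16 slot
PIN8_W  = 150   # one 8-pin PEG connector
PIN6_W  = 75    # one 6-pin PEG connector
USB_C_W = 20    # 20-series VirtualLink USB-C port (not included in the quoted TDP)

def available_power(n_8pin: int, n_6pin: int) -> int:
    """Maximum power the card can legally pull from slot + connectors."""
    return SLOT_W + n_8pin * PIN8_W + n_6pin * PIN6_W

def total_draw(tdp_w: int, usb_ports: int = 0) -> int:
    """Worst-case draw: quoted TDP plus any USB-C port load (20-series only)."""
    return tdp_w + usb_ports * USB_C_W

# RTX 3080: 320 W TGP, fed by 2x 8-pin
print(available_power(2, 0))          # 375 W available
print(total_draw(320))                # 320 W draw -> fits, with headroom for boost/OC

# a 20-series card rated at 215 W, with its USB-C port in use
print(total_draw(215, usb_ports=1))   # 235 W effective draw
```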
PCIe by Nvidia
  • Since PCIe 1.0 it should work, but performance will differ (be lower) per generation; of course, all the components involved contribute to better or worse performance, plus an adequate power supply for the whole thing to work!
  • But the NVIDIA representative in Brazil says that PCIe is PCIe!
  • To get the full performance of the new line, it is best to meet the announced minimum requirement, that is, PCIe 4.0 (see the bandwidth sketch below).
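To give a rough sense of what "each generation performs differently" means, here is a small sketch of the per-direction bandwidth of a x16 slot by PCIe generation, using nominal per-lane rates (real-world throughput is a bit lower):

```python
# Effective data rate per lane in GB/s (after line encoding), nominal figures.
PER_LANE_GBPS = {
    "PCIe 1.0": 0.25,
    "PCIe 2.0": 0.5,
    "PCIe 3.0": 0.985,
    "PCIe 4.0": 1.969,
}

for gen, per_lane in PER_LANE_GBPS.items():
    print(f"{gen} x16: ~{per_lane * 16:.1f} GB/s per direction")

# PCIe 1.0 x16: ~4.0 GB/s   -> the card still works, just with far less headroom
# PCIe 4.0 x16: ~31.5 GB/s  -> the bandwidth the 3000 series is specified for
```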
Notebook targets
  • NVIDIA says it will hit as many targets as possible with the new series, and notebooks are in the works where possible, despite the power issue and the TDP figures presented for desktops.
Target audience for the new series
  • GeForce 3090 -> only for enthusiasts and those who really need it (the key word), such as AI teams, 8K streamers, AI researchers, etc.: big memory and compute requirements, combining cards via NVLink for professional applications, pooling the processing and memory of all linked cards (like SLI, which is no longer used by game programming techniques, for example) but with different, modern techniques!
  • for example: Quadro RTX 8000, 48 GB VRAM = US$ 8,000.00 ----> 2x RTX 3090, 24 GB VRAM each, 2x US$ 1,500.00 = US$ 3,000.00
  • or else, anyone who has the money to buy it!
  • for the rest of us mortals, the RTX 3080 will be the big choice!
  • the RTX 3070 is for people who want to marvel at the new real-time processing
The "NANO" building
  • now, Samsung is the company that built the new 8nano chip. Before it was TSMC, but because of technical bottleneck issues at the company, it cannot supply the 7nano to Nvidia. This was also a factor in obtaining lower values in the final value of the new 3000 series.
  • But even so, the overall performance of the components is considered to be 1.9x the previous ones, in general. For example, the GTX3070 is equivalent to a GTX2080 Ti, at a much lower cost, and much higher performance!
VRAM in the 3000 series
  • The NVIDIA representative says that if you have VRAM on your card, many games will actually start using it from now on.
  • Example of memory usage:
  • Red Dead Redemption 2 at 4K Ultra uses 6 GB of VRAM (not 8 GB as many speculate on the internet) at 60 fps in NVIDIA's internal tests. The RTX 3070 already delivers 4K Ultra at 60 fps, so the 3080's 10 GB of VRAM is there for design reasons tied to the 320-bit bus: more chips = more card capacity (see the sketch after this list).
  • He talked about the memory allocation shown by "Afterburner", which is not necessarily used by the game but by the system (and who is "the system"?)
    • read about prefetching in the operating system!
    • see the cool tool "Intelligent Standby List Cleaner" by Wagnardsoft, the makers of Display Driver Uninstaller (DDU)!
    • and "RAMMap" from Sysinternals to check the RAM consumed by system prefetching!
  • Having more memory than the game needs does not cause any performance problem.
  • There is more benefit in having an SSD, especially a latest-generation one, than in having more memory on the video card: what matters is the speed of data delivery, not storage.
  • This is because the new consoles already launched, like the PS5, are betting on the SSD.
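A short sketch of the "bus width = chip count = capacity" reasoning above. It assumes GDDR6/GDDR6X devices with a 32-bit interface and 1 GB per 32-bit channel (the 3090 mounts two 1 GB chips per channel, clamshell style); the function name is only for illustration.

```python
# Why a 320-bit bus "forces" 10 GB: each memory chip exposes a 32-bit interface,
# so bus width fixes the number of 32-bit channels, and channels x GB per channel
# fixes the total capacity.
CHANNEL_BITS = 32

def vram_capacity(bus_width_bits: int, gb_per_channel: int) -> int:
    channels = bus_width_bits // CHANNEL_BITS
    return channels * gb_per_channel

print(vram_capacity(320, 1))   # RTX 3080: 10 channels x 1 GB = 10 GB
print(vram_capacity(256, 1))   # RTX 3070:  8 channels x 1 GB =  8 GB
print(vram_capacity(384, 2))   # RTX 3090: 12 channels x 2 GB (two 1 GB chips each,
                               # clamshell) = 24 GB
```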
The Streaming Multiprocessors (SM):
  • The SMs (streaming multiprocessors) of the new series can decompress up to 100 GB/s. The RTX 3080 has 68 SMs.
Use of NVMe SSDs with the 3000 series
  • RTX IO does not necessarily depend on an NVMe 4.0 SSD; it can speed up SATA SSDs and even mechanical hard drives.
  • On a mechanical HDD, game loading time can even be cut in half!
  • With a SATA SSD, game load times can approach those of an NVMe SSD when the data decompression is done on the CPU. But then the CPU takes on that job to optimize loading, which means more data processing and a significant extra load on it = more energy = more heat (see the load-time sketch after this list).
  • Some games may list an SSD as a "minimum requirement" to work. But that is not the rule for now, and it has nothing to do with the NVIDIA 3000 series!
  • World of Warcraft: Shadowlands listed as a minimum requirement an SSD with 100 GB available
    • UPDATED: that SSD minimum requirement has been removed!
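A back-of-the-envelope sketch of why both the storage tier and where decompression happens matter. The read and CPU-decompression throughputs, and the 20 GB of compressed assets, are rough assumed figures, not measurements; the only number taken from the post is the ~100 GB/s GPU decompression rate quoted for the new SMs.

```python
# Load time = time to read the compressed assets + time to decompress them.
READ_GBPS = {              # sequential read throughput, GB/s (rough assumptions)
    "HDD":      0.15,
    "SATA SSD": 0.55,
    "NVMe 3.0": 3.5,
    "NVMe 4.0": 7.0,
}
CPU_DECOMP_GBPS = 1.5      # assumed CPU decompression rate
GPU_DECOMP_GBPS = 100.0    # the ~100 GB/s figure quoted for the 30-series SMs

def load_time(compressed_gb: float, read_gbps: float, decomp_gbps: float) -> float:
    # Simplification: read and decompression run back to back, with no overlap.
    return compressed_gb / read_gbps + compressed_gb / decomp_gbps

for name, read in READ_GBPS.items():
    cpu = load_time(20, read, CPU_DECOMP_GBPS)
    gpu = load_time(20, read, GPU_DECOMP_GBPS)
    print(f"{name:9s}: CPU decompress ~{cpu:5.1f} s, GPU decompress ~{gpu:5.1f} s")
```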
Reflex SDK to optimize the internal functions of games
  • Some games already use this new SDK, such as Fortnite, Apex Legends and CoD: Cold War, with more to come.
  • The Reflex SDK (NVIDIA) schedules the necessary work at the appropriate time: not necessarily anticipating input reads, but choosing the most appropriate moment to optimize the game's workload and thus deliver better performance at the far end, that is, in the eyes of the gamer at the display.
  • With this new technique, NVIDIA always uses the video card's maximum-performance mode, telling the CPU to always work at full speed and no longer in power-saving mode, which is one of the causes of "bottlenecks" in game scenes.
  • Regardless of the FPS shown on screen, system latency varies a lot; this is the so-called "end-to-end" latency, that is, from the mouse click to the action appearing on the screen or in the game, as you prefer.
  • The whole chain is long: mouse hardware + mouse driver + USB driver ... + the CPU in use, game software response + video driver + video card hardware + interfaces + the display in use (panel and refresh response) ... in other words, hardware or software, every link consumes some time processing what was requested (see the latency-chain sketch after this list).
  • It will no longer be necessary to make this configuration in the NVIDIA driver's Control Panel!
  • From now on, maximum mode is the watchword!
  • This will be standard, both for games that use 100% of the GPU and for those that use 100% of the CPU.
  • In practice (or in theory, for some), with Reflex on you can reach at 1440p a latency equivalent to 1080p, and at 4K a latency equivalent to 1440p, with the technical limitations involved, of course. It should not be taken literally, because every system is different and must be studied with its own nuances.
  • The Reflex SDK works on the NVIDIA 900, 10xx, 20xx and 30xx series.
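A minimal sketch of the "end-to-end" idea: total latency is simply the sum of every stage in that chain. All stage values below are made-up, illustrative numbers; the render-queue entry stands for the part a scheduler like Reflex tries to shrink.

```python
# Click-to-photon latency as a sum of chain stages (all values are illustrative).
chain_ms = {
    "mouse + USB polling":   1.0,
    "OS / driver input":     1.0,
    "game simulation (CPU)": 5.0,
    "render queue wait":     8.0,   # the part a Reflex-style scheduler attacks
    "GPU render":            6.0,
    "scanout + display":     7.0,
}

total = sum(chain_ms.values())
print(f"end-to-end: {total:.1f} ms")

# Assume the render queue is kept nearly empty:
with_short_queue = total - chain_ms["render queue wait"] + 1.0
print(f"with a shorter render queue: {with_short_queue:.1f} ms")
```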
Historically, latency is measured like this:
  • click the mouse and watch its LED
  • film at 1000 fps and count the number of frames between the "click" and the resulting image being shown on the display screen (see the sketch after this list)
  • an expensive technique, and not available to everyone!
  • Now, on the new 360 Hz displays, there is a component that performs the latency measurement.
    • you plug your mouse into the display's USB port
    • the equipment detects the mouse click
    • and an optical sensor placed over the middle of the screen detects how many frames it takes for the image change to appear, for example the flash of a gun shot in a game, counted from the mouse click!
  • of course, a 360 Hz display is very expensive too, but it is much more practical than before!
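The arithmetic behind the high-speed-camera method is simple: at 1000 fps each camera frame is 1 ms, so the number of frames counted between the LED lighting up and the on-screen change is the latency in milliseconds. The 37-frame example below is purely hypothetical.

```python
# Film the screen at 1000 fps and count frames between the mouse LED and the
# resulting on-screen change (e.g. the muzzle flash).
CAMERA_FPS = 1000

def latency_ms(frames_counted: int, camera_fps: int = CAMERA_FPS) -> float:
    # each camera frame lasts 1/fps seconds, so the resolution here is 1 ms
    return frames_counted / camera_fps * 1000

print(latency_ms(37))   # 37 frames at 1000 fps -> ~37 ms end-to-end
```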
A brief game comparison to reveal some info:
  • Red Dead Redemption 2
    • NVIDIA 2080 Ti, 4K Ultra, ~50 fps
    • NVIDIA 3080, 4K Ultra, > 60 fps
  • Flight Simulator (CPU-intensive)
    • NVIDIA 2080 Ti, Ultra, ~40 fps
    • NVIDIA 3080 Ti, Ultra: should easily go beyond that with the optimizations Microsoft will roll out, given the game's heavy reliance on the CPU.
    • Thus, NVIDIA should be able to use its new series to process the large volume of map data, for example via DLSS, DirectX 12, Reflex, etc.; some of these are still to be implemented!
Use of VR with this series:
  • No official information; and given the cost to the end user, no particular use of this technology is in the plans.
  • VR did not have the sales needed for companies to develop this business niche in general.
  • It only made, and makes, sense in professional computer graphics and artificial intelligence work.
  • An example of this is the film "The Lion King", shot entirely using VR so that the director could take in the scenes as if he were experiencing them.
  • So you can see how expensive it is to bring this system to hardware aimed at the general public.
Recommended CPU for this series:
  • NVIDIA representative: any CPU that can deliver 60 fps can be used for this purpose.
    • for example, a 6-core CPU is enough to reach 60 fps!
    • The 3000 series does not necessarily require a specific CPU to function.
    • It all depends on the intended use; it is up to the user to decide.
    • if you have a 4-core Intel i3 that does not reach 60 fps, then... your GPU will not solve that limitation!
    • tested by the NVIDIA representative in Brazil:
    • Rainbow Six: Siege at a minimum of 360 fps: fine, but you need the latest CPU to hold it all the time, or almost all the time, my friend! (see the frame-time sketch after this list)
      • with an AMD Ryzen 3600X... it was not possible (~300 fps)!
      • with a 6-core Intel like the 8700K, even an i5 can reach it
    • Call of Duty: Warzone
      • needs a very, very strong CPU to get past 200 fps! A heavy game!
    • Trend: heavy games will need 8 cores for 60 fps
  • ME: if you are an enthusiast gamer, build your system according to your intended use!
  • ME: if you are a weekend gamer, your needs will be smaller, or at least you will not need as much hardware, agreed?
    • Spend your money on a trip with the people you care about most; your satisfaction will be complete, in real time, with billions of colors and no FPS limit on any screen.
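A quick sketch of the frame-time budget behind those numbers: to hold a given FPS, the CPU (and GPU) must finish every frame within 1000 / fps milliseconds, which is why 360 fps is so CPU-bound.

```python
# Per-frame time budget for a target frame rate.
def frame_budget_ms(fps: int) -> float:
    return 1000 / fps

for fps in (60, 144, 240, 300, 360):
    print(f"{fps:3d} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# 360 fps leaves only ~2.78 ms per frame; the ~300 fps Ryzen 3600X result above
# corresponds to ~3.33 ms per frame.
```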
And what about TVs as displays?
  • with the new HDMI 2.1:
    • 4K at 120 Hz is guaranteed!
    • 8K at 60 Hz over a single cable is fine! (see the bandwidth sketch below)
  • one example: an LG 4K, 120 Hz, G-Sync, OLED TV is as good as it gets!
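A rough sketch of why a single HDMI 2.1 cable handles 4K 120 Hz while 8K 60 Hz leans on compression: compare the raw video bandwidth of each mode against the link's roughly 42.7 Gb/s effective payload (48 Gb/s FRL minus coding overhead, an assumed round figure here). Blanking intervals are ignored, so real requirements are somewhat higher.

```python
# Raw video bandwidth of a display mode vs. the HDMI 2.1 payload budget.
HDMI21_PAYLOAD_GBPS = 42.7   # ~48 Gb/s FRL minus 16b/18b coding overhead

def video_gbps(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    bits_per_pixel = bits_per_channel * 3          # RGB / 4:4:4
    return width * height * hz * bits_per_pixel / 1e9

print(f"4K 120 Hz 10-bit: {video_gbps(3840, 2160, 120, 10):.1f} Gb/s")  # ~29.9 -> fits
print(f"8K  60 Hz 10-bit: {video_gbps(7680, 4320,  60, 10):.1f} Gb/s")  # ~59.7 -> needs
                                                                        # DSC or 4:2:0
```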
Software support in this series:
  • AV1 codec: royalty-free video compression; it should be adopted by many developers (Netflix will)!
    • for comparison: a 4K film that needs a 20 Mbit/s connection in H.264 may need only ~5 Mbit/s in AV1 (see the sketch after this list)
  • The ShadowPlay encoder will be updated to support 8K and HDR, captured directly on the card!
  • Coming soon... Twitch streaming at 1440p/144 fps!
  • RTX Broadcast (camera and audio) via the RTX Broadcast app works on the 20xx/30xx series (all RTX cards)!
    • AUDIO: it works on your own microphone and also on the sound in your headphones (your friends' calls)!
    • CAM: background removal, blur effect, face tracking, etc...
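What that bitrate gap means in practice, as a tiny sketch: the data transferred for a 2-hour 4K stream at the two bitrates quoted above (the 2-hour duration is just an assumption for the example).

```python
# Data transferred for a stream of a given bitrate and duration.
def stream_gb(mbit_per_s: float, hours: float) -> float:
    return mbit_per_s * hours * 3600 / 8 / 1000    # Mbit -> GB

print(f"H.264 @ 20 Mbit/s, 2 h: {stream_gb(20, 2):.1f} GB")   # ~18 GB
print(f"AV1   @  5 Mbit/s, 2 h: {stream_gb( 5, 2):.1f} GB")   # ~4.5 GB
```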
 

emailx45

From NVIDIA's official site:

                                      GeForce RTX 3090       GeForce RTX 3080       GeForce RTX 3070
GPU Engine Specs:
  NVIDIA CUDA Cores                   10496                  8704                   5888
  Boost Clock (GHz)                   1.70                   1.71                   1.73
Memory Specs:
  Standard Memory Config              24 GB GDDR6X           10 GB GDDR6X           8 GB GDDR6
  Memory Interface Width              384-bit                320-bit                256-bit
Technology Support:
  Ray Tracing Cores                   2nd Generation         2nd Generation         2nd Generation
  Tensor Cores                        3rd Generation         3rd Generation         3rd Generation
  NVIDIA Architecture                 Ampere                 Ampere                 Ampere
  Microsoft DirectX 12 Ultimate       Yes                    Yes                    Yes
  NVIDIA DLSS                         Yes                    Yes                    Yes
  PCI Express Gen 4                   Yes                    Yes                    Yes
  NVIDIA GeForce Experience           Yes                    Yes                    Yes
  NVIDIA Ansel                        Yes                    Yes                    Yes
  NVIDIA FreeStyle                    Yes                    Yes                    Yes
  NVIDIA ShadowPlay                   Yes                    Yes                    Yes
  NVIDIA Highlights                   Yes                    Yes                    Yes
  NVIDIA G-SYNC                       Yes                    Yes                    Yes
  Game Ready Drivers                  Yes                    Yes                    Yes
  NVIDIA Studio Drivers               Yes                    Yes                    Yes
  NVIDIA GPU Boost                    Yes                    Yes                    Yes
  NVIDIA NVLink (SLI-Ready)           Yes                    -                      -
  Vulkan RT API, OpenGL 4.6           Yes                    Yes                    Yes
  HDMI 2.1                            Yes                    Yes                    Yes
  DisplayPort 1.4a                    Yes                    Yes                    Yes
  NVIDIA Encoder                      7th Generation         7th Generation         7th Generation
  NVIDIA Decoder                      5th Generation         5th Generation         5th Generation
  VR Ready                            Yes                    Yes                    Yes
Display Support:
  Maximum Digital Resolution (1)      7680x4320              7680x4320              7680x4320
  Standard Display Connectors         HDMI 2.1, 3x DP 1.4a   HDMI 2.1, 3x DP 1.4a   HDMI 2.1, 3x DP 1.4a
  Multi Monitor                       4                      4                      4
  HDCP                                2.3                    2.3                    2.3
Founders Edition Card Dimensions:
  Length                              12.3" (313 mm)         11.2" (285 mm)         9.5" (242 mm)
  Width                               5.4" (138 mm)          4.4" (112 mm)          4.4" (112 mm)
  Slot                                3-Slot                 2-Slot                 2-Slot
Founders Edition Thermal and Power Specs:
  Maximum GPU Temperature (in C)      93                     93                     93
  Graphics Card Power (W)             350                    320                    220
  Required System Power (W) (2)       750                    750                    650
  Supplementary Power Connectors      2x PCIe 8-pin          2x PCIe 8-pin          1x PCIe 8-pin
1 - Up to 4k 12-bit HDR at 240Hz with DP1.4a+DSC. Up to 8k 12-bit HDR at 60Hz with DP 1.4a+DSC or HDMI2.1+DSC. With dual DP1.4a+DSC, up to 8K HDR at 120Hz
2 - Requirement is made based on PC configured with an Intel Core i9-10900K processor. A lower power rating may work depending on system configuration.
Note: The above specifications represent this GPU as incorporated into NVIDIA's reference graphics card design. Clock specifications apply while gaming with medium to full GPU utilization. Graphics card specifications may vary by add-in-card manufacturer. Please refer to the add-in-card manufacturers' website for actual shipping specifications.
 

eatmaster

Giants :) For true connoisseurs of games. Esports is a rather slippery thing.
 

eatmaster

What's the point of this RTX 3080 :ROFLMAO::ROFLMAO:. In a year they will release a new video card that is at least 1.5x better, maybe even 2x. I don't believe them.
 

emailx45

my last purchase was back in 2013... a GTX 650 Ti Boost 2 GB! :eek::ROFLMAO::ROFLMAO::ROFLMAO:
until now it has been working very well, including with CONTROL, DOOM 64, Far Cry 5 and... Shadow of the Tomb Raider.
but if I had the money... I would try the new RTX 3080
 

eatmaster

I also have 2 GB of video memory. Only not on a desktop PC, but on a laptop... And I don't play that often. I finished DOOM 2016 and that was it ;)