
Would you pay more for hardware with AI capabilities?

  • Yes: 746 votes (8.3%)
  • No: 7,355 votes (81.3%)
  • Don't know: 941 votes (10.4%)

  • Total voters: 9,042

ARF

Joined
Jan 28, 2020
Messages
4,103 (2.59/day)
Location
Ex-usa | slava the trolls
Would you pay more for hardware with AI features?

Nope.
This sounds like asking whether you'd pay more for hardware with MMX, SSE or AVX instructions...
The engineers' job is to design hardware that performs well where it's needed. Charging extra just because a marketing buzzword is included... makes no sense, and is not justified.
 
Joined
Dec 5, 2013
Messages
619 (0.16/day)
Location
UK
AI may well have its uses, but I honestly can't wait until the current "echoes of the dot-com bubble" level of hysterical overhype dies down, and the people who end up using it are those who actually need it, not the "let's try and shoehorn it into anything and everything for the sake of jumping on a marketing bandwagon" crowd currently doing the rounds.
 
Joined
Apr 2, 2011
Messages
2,680 (0.56/day)
No.

Current AI is largely Large Language Models approximating intelligence by training on and refining responses. By the time you see somebody type "Donald Trump as Garfield" and get something akin to that live-action Cats movie, somebody has culled enormous amounts of data...which is less "I" and more "AT&E": less intelligence and more artificial trial and error. While that's fine for somebody wanting novel stuff...it's not yet worth spending real money on.


Don't believe me? Cool. I have a parallel. Raise your hand if you've got a PhysX accelerator card in your PC right now.... I'll wait. Nobody? Ahhh....once they started including the hardware for physics calculations in GPUs, it suddenly made sense to have it...just like right now AI is a buzzword and the people out there spending money on it are buying cards specifically for it....which, within the two to three generations it takes until it actually matters, will eventually be eaten up and turned into components of existing cards rather than new hardware...
 
Joined
Jun 1, 2010
Messages
281 (0.05/day)
System Name Very old, but all I've got ®
Processor So old, you don't wanna know... Really!
As I've said many times before, I would prefer to remove any and all AI and other "non-rendering" parts from the GPU, and fill the freed-up space (or rather the space not seized by AI) with more Compute Units, ROPs and TMUs, along with better memory for that price difference. Or at least cut out the AI part, making the die footprint much smaller and cheaper, with better thermal performance. That would also mean the GPU would be more capable of running higher resolutions with better settings, with much less reason to use any upscaler.
This is a big hell no. Make AI ASICs add-in cards.
I would even expand the subject and say that RTX, or RTRT, cards should have been add-on cards as well. It would be perfect from a business standpoint, and would have benefited consumers too. They could sell their GeForce GTX cards and pair them with RTX add-on cards (doubling revenue from both product lines) to promote pure RTRT compute power. Everyone interested in it would jump straight in. There would be tons more sales. Just like people with older GeForce cards and Radeons were buying GTX ones for the sake of PhysX and encoding.

Also, a separate RTRT card would have been a better solution, as it wouldn't be limited by a common power envelope shared with the raster GPU parts. There would be much more room to grow and make the RTRT card as powerful as possible. Such a separate solution would benefit the "AI" crowd even more, as they don't need raster and video output in every single card in their AI farms; they just need the compute power itself.

Had RTRT add-on cards been separate, it would have hurt AMD to a much lesser degree, and they could have dedicated more raster performance to their Radeons, which wouldn't have lowered their value at all.
 
Last edited:
Joined
Jan 2, 2024
Messages
218 (1.43/day)
Location
Seattle
System Name DevKit
Processor AMD Ryzen 5 3600 ↗4.0GHz
Motherboard Asus TUF Gaming X570-Plus WiFi
Cooling Koolance CPU-300-H06, Koolance GPU-180-L06, SC800 Pump
Memory 4x16GB Ballistix 3200MT/s ↗3600
Video Card(s) PowerColor RX 580 Red Devil 8GB ↗1380MHz ↘1105mV, PowerColor RX 7900 XT Hellhound 20GB
Storage 240GB Corsair MP510, 120GB KingDian S280
Display(s) Nixeus VUE-24 (1080p144)
Case Koolance PC2-601BLW + Koolance EHX1020CUV Radiator Kit
Audio Device(s) Oculus CV-1
Power Supply Antec Earthwatts EA-750 Semi-Modular
Mouse Easterntimes Tech X-08, Zelotes C-12
Keyboard Logitech 106-key, Romoral 15-Key Macro, Royal Kludge RK84
VR HMD Oculus CV-1
Software Windows 10 Pro Workstation, VMware Workstation 16 Pro, MS SQL Server 2016, Fan Control v120, Blender
Benchmark Scores Cinebench R15: 1590cb Cinebench R20: 3530cb (7.83x451cb) CPU-Z 17.01.64: 481.2/3896.8 VRMark: 8009
would you pay more for hardware with MMX, SSE or AVX instructions...
The answer was pretty loud back in the day when 3DNow! was cool. The screeching was also pretty loud on the day VR users suddenly needed AVX instructions to run an update to the Oculus audio spatialization codec. I've never seen such an oddity get patched out so fast. I get that there's a premium on new features, but a lot of it has been outrageous. At some point there's gonna be another culling of features (like there was with GPUs back when DX9 was still a big deal) and a wave of new ones to lock in Win11/12/13 codependency. It's only going to get worse with time.

I can see a modest list of features being preserved:
MMX, Every SSE* ever made, AES, AVX, AVX2, DEP, FMA3, SHA

Then others may be piecemealed out as customer-specific features moving forward. I don't care about FMA4, don't want AVX-512, and I'm not paying for a dozen features that I didn't want in the first place. It's possible that x86-64 completely disappears in ARM64 builds and shows up in some coprocessor or multi-core module for "high compatibility" server applications. These chip makers could choose to completely flip the chess board at any time, and people will buy the new thing because it mysteriously performs at an extreme advantage.
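Purely as an illustration (a made-up sketch, not anything from this thread): this is roughly how software copes with that feature soup today, probing at runtime which of the extensions above the CPU actually exposes. It assumes GCC or Clang on x86 and uses their documented __builtin_cpu_supports() helper; DEP and SHA aren't queryable through it, so they're left out.

```c
/* Illustrative only: runtime detection of some x86 extensions mentioned above,
 * using GCC/Clang's __builtin_cpu_supports(). */
#include <stdio.h>

int main(void)
{
    __builtin_cpu_init();  /* populate the compiler's CPU feature cache */
    printf("MMX     : %s\n", __builtin_cpu_supports("mmx")     ? "yes" : "no");
    printf("SSE4.2  : %s\n", __builtin_cpu_supports("sse4.2")  ? "yes" : "no");
    printf("AES     : %s\n", __builtin_cpu_supports("aes")     ? "yes" : "no");
    printf("AVX     : %s\n", __builtin_cpu_supports("avx")     ? "yes" : "no");
    printf("AVX2    : %s\n", __builtin_cpu_supports("avx2")    ? "yes" : "no");
    printf("FMA3    : %s\n", __builtin_cpu_supports("fma")     ? "yes" : "no");
    printf("AVX-512F: %s\n", __builtin_cpu_supports("avx512f") ? "yes" : "no");
    return 0;
}
```

Code that dispatches on checks like these can keep a fallback path for older CPUs instead of hard-requiring the newest extension.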

That's the thing though. We're always going to have benchmarks.
 

GrumpyOtaku

New Member
Joined
Sep 17, 2021
Messages
6 (0.01/day)
I see "AI" as just abother bubble ready to burst someday. Just like the dot-com bubble, the housing bubble, and even going back to the Tulip Mania of the 17th century. Certainly there will be the few survivors after the bubble bursts, and it can be worthwhile to gain an understanding of it, and it will have it's uses, but it's already a grossly-misused and poorly defined buzzword. Throw the letters "AI" onto your product name, and suddenly it's all-new and better, even if you've done NOTHING to change it.

AI? Perhaps you might find some love for 愛 or あい ... (if anyone gets the joke)
 
Joined
Jul 5, 2013
Messages
25,700 (6.45/day)
I would even expand the subject and say that RTX, or RTRT, cards should have been add-on cards as well.
This is where I disagree. RTRT functionality is a direct graphics feature and directly affects gaming experiences. Being built into the GPU makes logical sense. However, AI functionality has never been required for general computing, gaming or many other tasks.
 
Joined
Mar 26, 2010
Messages
9,829 (1.90/day)
Location
Jakarta, Indonesia
System Name micropage7
Processor Intel Xeon X3470
Motherboard Gigabyte Technology Co. Ltd. P55A-UD3R (Socket 1156)
Cooling Enermax ETS-T40F
Memory Samsung 8.00GB Dual-Channel DDR3
Video Card(s) NVIDIA Quadro FX 1800
Storage V-GEN03AS18EU120GB, Seagate 2 x 1TB and Seagate 4TB
Display(s) Samsung 21 inch LCD Wide Screen
Case Icute Super 18
Audio Device(s) Auzentech X-Fi Forte
Power Supply Silverstone 600 Watt
Mouse Logitech G502
Keyboard Sades Excalibur + Taihao keycaps
Software Win 7 64-bit
Benchmark Scores Classified
No. Until now the processing has run over the internet, so there's no need to pay more for AI in the hardware itself.
For now the "AI" label looks like it's become a label that makes consumers pay more for something they don't actually get in real life.
 

bug

Joined
May 22, 2015
Messages
13,320 (4.04/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
This is where I disagree. RTRT functionality is a direct graphics feature and directly affects gaming experiences. Being built into the GPU makes logical sense. However, AI functionality has never been required for general computing, gaming or many other tasks.
I believe AI is being included because implementations rely on compute and compute relies on shaders. Kind of hard to put AI on a separate chip or an add-on module.
 
Joined
Nov 11, 2020
Messages
432 (0.33/day)
Location
Earth, Solar System, Milky Way Galaxy, Local Group
Processor AMD Ryzen 7 5700X
Motherboard Asus TUF Gaming B550M-Plus (Wi-Fi)
Cooling Thermalright PA120 SE; Arctic P12, F12
Memory Crucial BL8G32C16U4W.M8FE1 ×2
Video Card(s) Sapphire Nitro+ RX 6600 XT
Storage Kingston SKC3000D/2048G; Samsung MZVLB1T0HBLR-000L2; Seagate ST1000DM010-2EP102
Display(s) AOC 24G2W1G4
Case Sama MiCube
Audio Device(s) Somic G923
Power Supply EVGA 650 GD
Mouse Logitech G102
Keyboard Logitech K845 TTC Brown
Software Windows 10 Pro 1903, Dism++, CCleaner
Benchmark Scores CPU-Z 17.01.64: 3700X @ 4.6 GHz 1.3375 V scoring 557/6206; 760K @ 5 GHz 1.5 V scoring 292/964
Personally I don't find any other usage for AI than making cool images.
I don't mean this personally, but I believe that AI-generated images and videos are totally worthless. Human creativity is always at the centre of a work of art. I would rather pay someone to draw pictures and create videos than buy hardware to run AI methods.
I liked AI before; it really helps in translation and medicine, for example, and it still does. But now I hate it. Look what AI has done: internet garbage, content we can't tell is real or fake... Nowadays AI has created more chaos than it has actually helped with anything.
There are probably three ways it's gonna end: a) AI gets out of control, b) humans regulate AI effectively and they get along, or c) AI collapses or is abandoned after this hype... (remember the metaverse?)
Idk, just talking gibberish

Edit: Tbh I'm quite disgusted by AI.
Take the gaming industry alone, for example.
What was it like before DLSS came out? Game developers tried their best to create detailed, beautiful models and optimise their games' performance, and hardware developers had no choice but to bring real improvements to microarchitectures, which combined to bring rapid progress.
And now? With this AI tech BS? Developers have been given a way to be lazy... Just upscale everything! Just let AI guess where and when everything should be on gamers' screens! Why bother making the best models? Why bother optimising? Why bother upgrading? In the end, DLSS/FSR/XeSS slows the gaming industry down.
 
Last edited:
Joined
Apr 2, 2011
Messages
2,680 (0.56/day)
This is where I disagree. RTRT functionality is a direct graphics feature and directly affects gaming experiences. Being built into the GPU makes logical sense. However, AI functionality has never been required for general computing, gaming or many other tasks.

So...help me here, because I used the exact same logic and came to the reverse conclusion. That logic was: the physics calculations required to simulate objects required a PhysX card...and it was great. The limited number of games that used the accelerator card suddenly did amazing things...I'm specifically thinking of destructible environments. The problem became whether you could assume a couple hundred dollars of GPU and of PhysX card...and PhysX on the GPU became a thing basically as a function of general compute catching up and being fast enough to do everything.

RT does not fundamentally change the experience. By definition it's simply calculating how light diffuses and diffracts from surfaces...which is only useful when you're knee-deep in the realism pipeline, and making water and slick surfaces look really good is a matter of brute force rather than art style. The thing is...years before RT we had plenty of games where water was a thing and didn't require the processing power of a $2000 card to be pretty: Bioshock Water
PhysX changed how games work. RT changes how some things look. For my money, the exclusion of RT from modern cards and the development of an RT accelerator card would be the best of both worlds...because the 99% of people who don't give a crap about RT could buy a better card with the saved money...and the 1% could get their APEX visuals in the few games that actually support it with a dedicated card.
-all of this said, the logic is a moon-shot. Selectively disabling RT unless you buy a specific add-in card would probably make it non-viable in development costs and prevent soaking its development costs with the base GPUs....not even mentioning the spaghetti code it'd require. Thank god I'm a grease monkey more than a code monkey-
 

bug

Joined
May 22, 2015
Messages
13,320 (4.04/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
@lilhasselhoffer RT is a bit like electric cars: nicer in the long run, but impossible to switch to right away.
Just like an EV is much simpler to build than a traditional ICE car is, working with RT eliminates some of the programming complexity*. This comes at a VRAM and GPU HP cost, but these burdens will become more manageable over time. Just like EVs, the gain will be there when we switch completely. In the meantime, GPUs doing both rasterization and RT are the hybrids we have to deal with as a compromise.

*The biggest advantage being that illumination takes just one pass, not several as it does for rasterization. Here's an ok-ish explanation (I'm sure you can find some better ones, too): https://users.aalto.fi/~lehtinj7/CS-C3100/2021/slides/15.1.rasterization.basics.pdf
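Purely as an illustration of what "one pass" means here (a made-up toy, not from those slides): a tiny C ray caster where the same per-pixel loop that finds the visible surface also shades it with direct light on the spot, whereas a rasterizer typically splits this across separate geometry, shadow and lighting passes. The scene and names are invented for the example.

```c
/* Toy illustration: per-pixel ray casting against one sphere, with lighting
 * resolved in the same traversal that determines visibility. */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } vec3;

static vec3 sub(vec3 a, vec3 b) { return (vec3){ a.x - b.x, a.y - b.y, a.z - b.z }; }
static double dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static vec3 norm(vec3 a) { double l = sqrt(dot(a, a)); return (vec3){ a.x/l, a.y/l, a.z/l }; }

/* Distance along a unit-length ray to the sphere, or -1 on a miss. */
static double hit_sphere(vec3 center, double r, vec3 origin, vec3 dir)
{
    vec3 oc = sub(origin, center);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - r * r;
    double disc = b * b - 4.0 * c;          /* dir is unit length, so a == 1 */
    if (disc < 0.0) return -1.0;
    return (-b - sqrt(disc)) / 2.0;
}

int main(void)
{
    const int W = 32, H = 16;
    vec3 sphere = { 0, 0, -3 }, light = { 2, 2, 0 };

    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            /* One primary ray per pixel through a simple pinhole camera. */
            vec3 dir = norm((vec3){ (x - W/2) / (double)W,
                                    (H/2 - y) / (double)H, -1 });
            double t = hit_sphere(sphere, 1.0, (vec3){ 0, 0, 0 }, dir);
            if (t < 0) { putchar(' '); continue; }

            /* The hit point, its normal and the light direction are all known
             * right here, so shading happens in the same pass as visibility. */
            vec3 p = { dir.x * t, dir.y * t, dir.z * t };
            vec3 n = norm(sub(p, sphere));
            vec3 l = norm(sub(light, p));
            double diffuse = dot(n, l) > 0 ? dot(n, l) : 0;
            putchar(" .:-=+*#%@"[(int)(diffuse * 9.0)]);
        }
        putchar('\n');
    }
    return 0;
}
```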
 
Joined
Apr 2, 2011
Messages
2,680 (0.56/day)
@lilhasselhoffer RT is a bit like electric cars: nicer in the long run, but impossible to switch to right away.
Just like an EV is much simpler to build than a traditional ICE car is, working with RT eliminates some of the programming complexity*. This comes at a VRAM and GPU HP cost, but these burdens will become more manageable over time. Just like EVs, the gain will be there when we switch completely. In the meantime, GPUs doing both rasterization and RT are the hybrids we have to deal with as a compromise.

*The biggest advantage being that illumination takes just one pass, not several as it does for rasterization. Here's an ok-ish explanation (I'm sure you can find some better ones, too): https://users.aalto.fi/~lehtinj7/CS-C3100/2021/slides/15.1.rasterization.basics.pdf

So...you aren't convincing me...and you've not answered the basic question. What you've done is explain to me that Nvidia has bundled up the steps required to generate more realistic lighting behind a rather substantive calculation requirement....so that instead of iterating simpler equations you can do one much more complex calculation and get an answer. Note how much that is exactly like the physics object calculations.

I'd argue that you should call me out on this...and that I should then be able to answer you in 10 words or less. "Fourier Transform" is my two-word response. With an infinite number of terms we can match any point cloud to a regular repeating function...and Fourier Transforms take huge computational power to crank out. A single sine curve fitted at r=0.95 gives us something nobody will complain about...that isn't perfect...but also doesn't require specific new hardware and software to do.
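To put rough numbers on that analogy (my own made-up sketch, not anything from the thread): a square wave approximated by its truncated Fourier series. The first harmonic already captures most of the shape, and each extra term costs more compute for a smaller reduction in error.

```c
/* Illustration of diminishing returns: square(t) ~ (4/pi) * sum over odd k of
 * sin(k*t)/k, evaluated with more and more terms. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

static double square_wave(double t) { return sin(t) >= 0.0 ? 1.0 : -1.0; }

static double fourier_partial(double t, int terms)
{
    double s = 0.0;
    for (int k = 1; k <= 2 * terms - 1; k += 2)   /* odd harmonics only */
        s += sin(k * t) / k;
    return 4.0 / M_PI * s;
}

int main(void)
{
    const int terms_to_try[] = { 1, 3, 10, 100 };
    for (int i = 0; i < 4; i++) {
        int n = terms_to_try[i];
        double err = 0.0;
        for (int s = 0; s < 1000; s++) {          /* RMS error over one period */
            double t = 2.0 * M_PI * s / 1000.0;
            double d = fourier_partial(t, n) - square_wave(t);
            err += d * d;
        }
        printf("%4d term(s): RMS error %.3f\n", n, sqrt(err / 1000.0));
    }
    return 0;
}
```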


Regarding your example....you've never really designed a car, have you? There are off-the-shelf selections for virtually any component...whereas a good EV requires a lot of engineering, monitoring, and even redesign of basic mechanics (read: aluminum unibody) to work. For example, a basic H-bridge can control a motor...unless you want regenerative braking. Unfortunately you'll need a charging circuit for the battery, a battery/cell monitoring circuit, a means to dump the regenerative energy back into the battery, and oh, you'll need to design an efficient motor. A motor capable of being powered to start, drawing minimal current while running, venting heat quickly, etc... So no. RT isn't "like how much easier it is to design an EV versus an ICE." RT is forcing people to choose your hardware to use a feature you financially support in order to differentiate yourself from the competition...because you cannot really compete by offering performance anymore...

If it isn't clear...EVs aren't your average golf cart. A golf cart has a simple bi-directional switch mounted to a battery, so you've got three speeds: full reverse, full forward, and not connected. Believe it or not, I may rail against EVs...but that's because once you understand everything compromised about them, you'll never be able to look at them as "a simple air hockey table in a vacuum chamber" (Musk's description of the dead Hyperloop) ever again.
 

bug

Joined
May 22, 2015
Messages
13,320 (4.04/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
So...you aren't convincing me...and you've not answered the basic question. What you've done is explain to me that Nvidia has bundled up the steps required to generate more realistic lighting behind a rather substantive calculation requirement....so that instead of iterating simpler equations you can do one much more complex calculation and get an answer. Note how much that is exactly like the physics object calculations.
Yes, Nvidia has bundled... Because Nvidia invented RT **double facepalm**
 
Joined
Apr 2, 2011
Messages
2,680 (0.56/day)
Yes, Nvidia has bundled... Because Nvidia invented RT **double facepalm**

No. You are poorly informed...

Ray tracing is not new...what Nvidia made was hardware that would perform the specific calculations needed to trace diffusion and diffraction. If you are somehow unsure of this, I ask you to think back. Maybe a long time ago...maybe not. Think back to basic high school physics. Think back to that diagram where a light ray hits a glass surface at about 45 degrees: some of it diffuses on the surface, but part of that ray enters the glass, bounces off the opposite side, and exits the glass a little later. Do you maybe remember it now? Yes, ray tracing is simply diffraction and diffusion calculations performed as one...because in traditional rendering light is not treated as an object and therefore requires multiple passes to create depth effects.
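For what it's worth, the per-hit math behind that textbook diagram is small. Here's a purely illustrative C sketch (made up for this post, assuming the usual unit-vector conventions) of splitting an incoming ray into a mirror-reflected part and a Snell's-law refracted part at a glass surface; a ray tracer repeats this kind of calculation at every bounce.

```c
/* Illustrative only: reflection and Snell's-law refraction at a surface. */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } vec3;

static double dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Mirror reflection: r = d - 2(d.n)n */
static vec3 reflect(vec3 d, vec3 n)
{
    double k = 2.0 * dot(d, n);
    return (vec3){ d.x - k*n.x, d.y - k*n.y, d.z - k*n.z };
}

/* Refraction from index n1 into index n2; returns 0 on total internal reflection. */
static int refract(vec3 d, vec3 n, double n1, double n2, vec3 *out)
{
    double eta = n1 / n2;
    double cos_i = -dot(d, n);
    double sin2_t = eta * eta * (1.0 - cos_i * cos_i);
    if (sin2_t > 1.0) return 0;
    double cos_t = sqrt(1.0 - sin2_t);
    double k = eta * cos_i - cos_t;
    *out = (vec3){ eta*d.x + k*n.x, eta*d.y + k*n.y, eta*d.z + k*n.z };
    return 1;
}

int main(void)
{
    /* A ray hitting flat glass (n = 1.5) at 45 degrees, like the diagram. */
    double s = sqrt(0.5);
    vec3 d = { s, -s, 0 }, n = { 0, 1, 0 }, t;
    vec3 r = reflect(d, n);
    printf("reflected: (%.3f, %.3f, %.3f)\n", r.x, r.y, r.z);
    if (refract(d, n, 1.0, 1.5, &t))
        printf("refracted: (%.3f, %.3f, %.3f)\n", t.x, t.y, t.z);
    return 0;
}
```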


If you want the Wikipedia explanation, it's here: Wikipedia - Ray Tracing

That said, this is exactly the same story as physics. Remember when a body was considered rigid? That meant if you had a bunch of sticks around an explosive barrel, they'd fly away as solid sticks. Then PhysX came along and treated those sticks as either deformable or destructible bodies. This meant those sticks around the explosive barrel didn't just bounce around, they blew up into little tiny pieces, and then the particle calculations could take over as the dust settled.


Ray tracing does none of this. It literally only provides more photorealistic behavior for light rays...which is great if you want that. It's less great if you can tolerate water looking like Bioshock's and still being entirely immersive...because I don't need the next COD operator to have slightly prettier grease paint on their face that I have to spend $10 on, on top of the $70 game price...so that my latest $2000 purchase is justifiable in any way....because something at half the price has similar raster performance, so there must be a reason I spent all of that money...
 

Keullo-e

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
11,312 (2.70/day)
Location
Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X
Motherboard Gigabyte B550M Aorus Elite
Cooling Custom loop, Corsair ML/LL fans
Memory 48GB Kingston Fury DDR4-3200
Video Card(s) Asus GeForce RTX 3080 TUF 10GB @ +150/700
Storage ~4TB SSD + 6TB HDD
Display(s) Acer XV273K 4K120 + Lenovo L32p-30 4K60
Case Corsair 4000D Airflow White
Audio Device(s) Asus TUF H3 Wireless
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 + Asus TUF P1 mousepad
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis remastered at 4K
Isn't GPU RT just real-time, GPU-accelerated ray tracing? Like said, ray tracing isn't anything new by itself.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,837 (1.26/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Short answer: probably not. Current "AI" is a bit of a misnomer, and it's used far too liberally to give any good understanding of what on earth they actually mean by the term.

I do think certain tasks benefit from hardware acceleration, however. Take Nvidia with the RTX 20 series: I wouldn't have been a buyer hungry to buy at launch based on it having "AI CAPABILITY™"; I wanted to see the fruit of that hardware acceleration bear out some. Come the 30 series launch, that capability was starting to bear fruit, and I saw fit to choose the hardware with the capability over hardware lacking it in a cost-matched scenario, and for me personally, that was a great choice.

The short answer is basically that the question is far too vague to answer accurately with the given options; there's such a large dose of "it depends" to consider. So very many variables.
 
Joined
Jan 14, 2019
Messages
10,141 (5.16/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
I need my computers for gaming, video playback and web browsing. I have no use for AI.
 
Joined
Aug 23, 2013
Messages
459 (0.12/day)
Consumer uses for AI are very limited, or at least not that many. So I don't want to pay for something that, by the time it becomes useful, is going to be slow. It's almost like how RT currently is.
 
Joined
Jun 25, 2020
Messages
104 (0.07/day)
System Name The New, Improved, Vicious, Stable, Silent Gaming Space Heater
Processor Ryzen 7 5800X3D
Motherboard MSI B450 Tomahawk Max
Cooling be quiet! DRP4 (w/ added SilentWings3), 3x SilentWings3 120mm @Front, Noctua S12A @Back
Memory Teamgroup DDR4 3600 16GBx2 @18-22-22-22-42 -> 18-20-20-20-40
Video Card(s) PowerColor RX7900XTX HellHound
Storage ADATA SX8200 1GB, Crucial P3+ 4TB (w/adaptor, @Gen2x1 ), Seagate 3TB+1TB HDD, Kingston SA400 512GB
Display(s) Gigabyte M27U @4K150Hz, AOC 24G2 @1080p100Hz(Max144Hz) vertical, ASUS VP228H@1080p60Hz vertical
Case Phanteks P600S
Audio Device(s) Creative Katana V2X gaming soundbar
Power Supply Seasonic Vertex GX-1200 (ATX3.0 compliant)
Mouse Razer Deathadder V3 wired
Keyboard Non-branded wired full custom mechanical keyboard from my brother's leftover
So far, the marketing people at those so-called AI tech companies have failed to sell me anything on the PC side that requires any specific AI capabilities.
On the smartphone side, I don't think I've made use of any of the AI stuff in my iPhone other than the camera.

I'm more interested in GT Sophy in GT7 on PS5 than all the AI-related crap outside of that. That almost baited me into buying a PS5. Still on a PS4 Pro though.
And I don't think I would play anything else on PS4/PS5, so that would be an extremely costly purchase for a single game.
 

bug

Joined
May 22, 2015
Messages
13,320 (4.04/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
For a tech-enthusiast forum, it's quite surprising to see so many people declaring they don't care much about a new tech development.
And I don't mean people saying they don't want to pay more for AI, I mean people saying they don't care about AI on their PCs at all.
 
Joined
Jan 14, 2019
Messages
10,141 (5.16/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
For a tech-enthusiast forum, it's quite surprising to see so many people declaring they don't care much about a new tech development.
And I don't mean people saying they don't want to pay more for AI, I mean people saying they don't care about AI on their PCs at all.
If someone can show me any use case for AI that makes my life 0.001% better, then maybe I'll start caring about it.
 

bug

Joined
May 22, 2015
Messages
13,320 (4.04/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
If someone can show me any use case for AI that makes my life 0.001% better, then maybe I'll start caring about it.
You mean you've never used ChatGPT?
 

64K

Joined
Mar 13, 2014
Messages
6,255 (1.68/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
You mean you've never used ChatGPT?
I haven't. I've seen some crazy-shit advice posted from generative AI. The last one I read: several people asked what the best way to pass a kidney stone quickly was, and the advice was to drink plenty of urine.

 