4K Monitor

First of all, get rid of that damned credenza, man! LMAO OK, what is a display adapter? I am not the brightest bear in the woods.
My description is probably incorrect. My NVIDEA Control Panel refers to it as a Graphics Card. I've also heard it called a Graphics Adapter, but before we became more sophisticated we just called it a video card. Mine is an NVIDEA GeForce RTX 2060 Super. I think they left one word off the model name: the word Expensive should appear after Super. It looks like you can now buy one of these cards for $729.00. I hope this sucker doesn't fail!
 
You guys keep costing me money. LOL Found an Asus ProArt PA278CV on sale for US$387, thank you very much. And I have a credit at that store from the A7M III, which dropped US$300 in price right after I bought it. Now a card. What's a good photo editing card? I do not care about gaming. Thanks for the monitor help.
Haha. Can't help too much on the card; they've long since been updated from mine, but NVidea is what I have, one with 4 GB of dedicated RAM, which is very useful with the giant A7R4 files. Glad you found a monitor, though.
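(To put rough numbers on why that dedicated RAM matters: a back-of-the-envelope sketch, assuming 61 MP A7R IV frames decoded to 16 bits per RGB channel - actual memory use depends on the editor and its settings.)

```python
# Back-of-the-envelope: why big sensor files eat graphics memory.
# Assumes a 61 MP Sony A7R IV frame decoded to 16 bits per RGB channel;
# real editors add layers, previews and undo buffers on top of this.

MEGAPIXELS = 61          # published A7R IV sensor resolution
CHANNELS = 3             # R, G, B
BYTES_PER_CHANNEL = 2    # 16-bit working depth

frame_bytes = MEGAPIXELS * 1_000_000 * CHANNELS * BYTES_PER_CHANNEL
print(f"One decoded frame: {frame_bytes / 2**20:.0f} MiB")  # ~349 MiB
```

At roughly 350 MiB per full-resolution copy, a 4 GB card has room for only a handful of working buffers.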
 
The propeller heads over in Linux land have warned me away from NVidea cards. I can work in the W10 side of the house but am mainly here in Linux. Not sure about NVidea in W10. ~US$700 is nuts.
 
Yeah, a video card. I bet by that name they are cheaper.
 
If you are using a laptop or a Mac you don't need to worry about a display adapter, aka video card, graphics card, graphics adapter, and that damn expensive part (the fanciest graphics cards can cost more than the rest of the PC).
 
First off, it's Nvidia, not Nvidea.

I'm surprised Linux people steered you away from Nvidia - I thought Nvidia was more Linux friendly than AMD (the other big name in graphics).

The top-of-the-line video cards run into the thousands of dollars in my country, and well over a thousand in the US, so US$700 is considered reasonable.

Nvidia also make cards specifically for professional graphics work, like the Quadro series. They used to be seriously pricy, although I haven't looked at them for a few years.
 
Call it duck if you want. I have been warned off the older cards as they sometimes have problems. Likewise AMD. The more recent Linux kernels seem to have solved this. And the folks over on the Linux forum seem to think that the card I have, Intel UHD Graphics 630, is just fine.

I'm unsure on how to spend the money I have saved. Riotous living in Las Vegas sounds good.
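(If you want to double-check what Linux actually sees as the display adapter, here's a minimal sketch that reads it straight from sysfs - it assumes a desktop Linux kernel with /sys/bus/pci available, and the vendor IDs are the standard PCI ones.)

```python
# Minimal sketch: list display adapters on Linux via sysfs.
# Assumes /sys/bus/pci exists (standard on desktop Linux kernels).
from pathlib import Path

# Well-known PCI vendor IDs; anything else prints as a raw ID.
VENDORS = {"0x8086": "Intel", "0x10de": "Nvidia", "0x1002": "AMD"}

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    pci_class = (dev / "class").read_text().strip()
    if pci_class.startswith("0x03"):  # PCI class 0x03 = display controller
        vendor = (dev / "vendor").read_text().strip()
        print(dev.name, VENDORS.get(vendor, vendor))
```

An Intel UHD Graphics 630 should show up under vendor 0x8086.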
 
Yes, I can get an M1 box for US$900, less than many cards. But then I am shopping for software. Gillette: cheap razor, and they kill you on the blades. I'll stay with PCs and the Linux OS. W10 on the other side of the house; W10 came with the box.
 
As a card-carrying pedant, I am obliged to point out that the Intel graphics component is not a card, but built into the CPU - it does not plug in, so it's not a card. It may get called "a graphics card" by the ignorant, but it's not. You won't get as much graphics power from an Intel graphics option.

I guess riotous living in Las Vegas may result in some interesting images, but they may well be taken by others, so I guess they'd have to process them, rather than you o_O

The M1 chip incorporates graphics in a similar way to the Intel, but is reportedly considerably more powerful. So much so that Adobe based their Super Resolution feature upon it. It will be interesting to see what directions it takes. I've used both PCs and Macs in the past, and I've used Photoshop extensively on both (they are equivalent). I'm watching the M1 developments with some interest.
 
The M1 is a major game-changer. Intel and AMD got caught with their pants down. I'd bet they knew something about it before it was released, as there is a lot of gossip in Silicon Valley. But they can no longer compete. When that chip gets set up for PCs, things will really get exciting. I think the chip, M1, is wonderful. Imagine how I would feel if I liked Apple.

As for the Intel graphics, yes, it is on the chip rather than on a separate card in a slot. I am not playing games so processing power is not important. So long as it can render a bunch of accurate colors in high density, and it can, I will be happy. If I am unhappy I will just get a card, a real card. ;o)

As for pedantry, I used to be there, too. But the general level of education today made it too much work. Complete misunderstanding of homonyms, the subjunctive, nouns vs. adjectives, and so many other rhetorical failures caused me to throw up my hands in despair and retreat to my cave. You are a braver person than I.
 
And don't let the size be an excuse for not going 4K 32": if you can fit a keyboard and a mouse on your desk (edit: and my standing desk is as small as it gets), you can fit a 32" monitor too.

[Attached image: 21-11-17 09h07m50s #0063 S.jpg - ILCE-1, 85mm F1.4 DG DN | Art 020 at 85.0 mm, ƒ/1.4, 1/125 sec, ISO 250]
 
We may not all be chasing the same rabbit. A 27" is currently a pretty good size. But I will check it out. The SONY Trinitron came to market as a 12" screen and was dazzling. I can watch a small screen quite easily and do watch TV on my Amazon Kindle Fire when on the road. Likewise, a mono table radio can work for a symphony. This is not to say I do not enjoy a large-screen TV or a killer stereo. I do, but am not shackled to the device. And I need that spare change for black cigars, strong drink, fast cars and dangerous women.

The ASUS ProArt 32" starts at US$800 and runs up to US$4500. That's a lot of cigars.
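(For what it's worth, the 27" vs 32" question is easy to put numbers on: pixel density, PPI, is just the diagonal pixel count over the diagonal inches. A quick sketch for the panels mentioned in this thread - the PA278CV's 2560x1440 resolution is its published spec.)

```python
# Quick sketch: pixel density for the monitor options discussed above.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch = diagonal pixel count / diagonal in inches."""
    return hypot(width_px, height_px) / diagonal_in

print(f'27" QHD (PA278CV): {ppi(2560, 1440, 27):.0f} PPI')  # ~109
print(f'27" 4K:            {ppi(3840, 2160, 27):.0f} PPI')  # ~163
print(f'32" 4K:            {ppi(3840, 2160, 32):.0f} PPI')  # ~138
```

By that measure, a 32" 4K panel sits between the two 27" options: more working area than QHD without giving up much of the 4K sharpness.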
 
This is why I am considering the Mini, because you can add whatever screen you want, and price-wise it's not so bad. I started on Macs, and still have a 10-year-old iMac, which I don't currently use, but I agree that being tied into the locked architecture is a pain.
I would not go with the M1 Mini: two Thunderbolt ports is not enough. One is used for an external TB SSD working drive, another for a card reader. Then you want to do a tethered session, connect an archival HDD, and you are already out of ports and into unplug/plug hell.
Waiting for an M1 Max Mini with 4x TB ports to upgrade from my Mac Mini 2018 (the TB eGPU experience with that one was/is a complete disaster).
 
I'll avoid Apple/Mac as I do not want to buy software for that platform. I have enough software on W10 and Linux. And then there are the restricted software offerings to deal with. The whole Apple/Mac deal is objectionable, and I am not sure it is better other than the new M1. They'll eat everybody's lunch with that. You can be sure Intel's and AMD's legal departments are looking for loopholes in the patents. And other companies, too. This is an entirely new direction and a bridge to the newer technology whose name I forget, but it has the state where one switch can be positive and negative. Woohoo! I hope I live long enough to see some of this.
 
I never use tethered. I literally just need the internal HD and one external to back up to, so importing on a card reader and ejecting it still leaves me a port. It would work fine for me.
 
I have a Dell 27" 2K monitor and for my needs that's what I use, and it's pretty darn good. Has great color and brightness. It may not compare to a gaming monitor, but then I don't game. Best part is, it's USB-C (Thunderbolt) and charges my Mac at the same time, so fewer cords to deal with.
 
I'm very happy with the M2 Max Mac Studio - 4 Thunderbolt ports on the back, 2 USB-C on the front. To get 6 Thunderbolt ports requires the M2 Ultra, and I considered that for a bit and decided "Nah!".

I had it driving a BenQ SW320 and a BenQ SW321C for a while, and that worked OK, but I happened upon an SW321C with a single dead pixel for just over half price, so I replaced the SW320 with a second SW321C - the dead pixel is over near the left side, and I run Bridge on that monitor, and Photoshop on the perfect one - works very nicely.
 