I dabble with Photoshop CS5 and Pinnacle Studio 15 HD, but not professionally. I am rebuilding my rig after seven years, and although I really like my existing ViewSonic VX2035wm, despite its screen resolution of only 1280 x 1024, some folks are telling me I should get a new 1920 x 1080 HD/LED 24" monitor to see all of the actual colors in these two software titles.
If it's absolutely necessary for me to get a 1920 x 1080 HD monitor, I'm looking at these few:
1) ViewSonic VA2446M 24" Full HD monitor, with a dynamic contrast ratio of 10,000,000:1, NOT Energy Star rated, 5 ms response time, for $160.00 at Micro Center. Three-year warranty on parts/labor.
2) ASUS VX238H 23" HD LED, for $179.99, also at Micro Center. 1 ms response time, Energy Star, 80,000,000:1 contrast ratio, 16.7 million color support, also a 3-year parts and labor warranty.
3) LG EN33TW 24" Slim LED, 5 ms response time, 16.7 million color support, HDCP support, Energy Star, with only a ONE-year parts/labor warranty, for $149.99 at Micro Center.
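For what it's worth, here is a quick back-of-the-envelope sketch of what the jump from the current 1280 x 1024 screen to a 1920 x 1080 panel actually means in pixels and shape. The resolutions come from the posts above; the math itself is generic.

```python
from math import gcd

def pixel_count(width, height):
    """Total number of pixels at a given resolution."""
    return width * height

def aspect_ratio(width, height):
    """Reduced aspect ratio, e.g. (16, 9)."""
    g = gcd(width, height)
    return width // g, height // g

current = (1280, 1024)   # existing ViewSonic VX2035wm
proposed = (1920, 1080)  # the Full HD candidates listed above

print(aspect_ratio(*current))    # (5, 4) -- squarer screen
print(aspect_ratio(*proposed))   # (16, 9) -- widescreen
print(round(pixel_count(*proposed) / pixel_count(*current), 2))  # ~1.58x the pixels
```

So a Full HD panel is not just "HD" branding: it is a different shape and roughly 58% more pixels to preview video on.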
I haven't rebuilt yet, but my existing mobo is an MSI MS4375 P35 Platinum, socket 775, and although the manual stated this board came with its own DB-15 pin VGA port on the board, I never saw it (???Ummmm?), so I had to install my own Nvidia GeForce 8600GT.
Soon I will be upgrading to the new i7 4770 LGA 1150 3.4GHz CPU with the HD 4600 IGP on its chip, and some have told me I probably would NOT need a dedicated GPU, like for instance a GTX 650 Ti or something else, if I do not game and certainly do not overclock.
Some also tell me I would need the GPU, and that even watching YouTube videos would look much better than with the i7's graphics.
Thank you for any input!
I have not checked whether any of your choices are IPS, but I would suggest that is the first port of call. I upgraded my monitor out of necessity (the other died) 10 months ago and picked up an AOC IPS for, IIRC, £140. Of course there are better than that, and if you are serious about your photo work then you need a better monitor. And you will need to spend a lot more than what you propose.
I am no fan of using built-in graphics, which means the graphics share your system memory.
Why so low-budget if you're "serious"? Got a feeling CS5 might be going to waste (you can drop Pinnacle = not serious). I'm with DB83: go with a good IPS panel. Then calibrate the thing properly and get your money's worth from CS5: http://www.tftcentral.co.uk/reviews/eye_one_display2.htm. If you're serious, I mean.
HP for starters: http://www.tftcentral.co.uk/reviews/hp_zr2440w.htm. NEC for moving up a bit: http://www.tftcentral.co.uk/reviews/nec_p232w.htm. Of course, you can go further up the price ladder.
Last edited by sanlyn; 19th Mar 2014 at 03:40.
Pinnacle 15 will handle HD fine, but in order to edit in it properly you do need a monitor that can display HD at full resolution. In addition, when I got my new Canon HD video cam, the AVCHD would play with tons of dropped frames, jerkiness, artifacts, etc. using my integrated video. I got an Nvidia card (not ultra high end, but at the top of the middle price bracket) and suddenly all the dropped frames and artifacts stopped and everything ran smoothly. If you are an amateur or serious amateur, Pinnacle 15 will let you do some pretty impressive stuff. If you are totally serious about wanting pro quality, best to go with better-grade software.
Went to Fry's to look at monitors. I saw an ASUS IPS 22" 1920x1200 for $150.00. Looks pretty awesome. They had others too, but I will delve deeper into this after I'm done with the build.
Sanlyn, you told me I should "drop Pinnacle"? I would love to have a much better video editing program, and although Pinnacle can be very problematic at times, it's all I can afford for burning family videos of sports, birthdays, and holidays.
DB83, you said the CPU and GPU share memory? I thought that when you install a GPU, like a GTX 650, 770, etc., you choose which graphics to use in the BIOS.
So if I buy an Nvidia GTX GPU, I'll never use any of the CPU features?
You're at the point where you're better off with Pinnacle.
You said the jerky stuff stopped after you got a GPU (I'm guessing you were talking about using Pinnacle). Which integrated graphics were you using?
With the new Intel Haswell 4th gens, a lot of people are saying that the i7 4770's HD 4600 graphics are phenomenal at handling video. I still think that if I get the Nvidia GTX 650 (2GB, GDDR5), it would be better than my i7 graphics, even if I don't overclock or game.
New Soon to be Puter~
MSI B85-G41 LGA 1150
16 GB DDR3 1600 (12800)
Samsung 840EVO 250GB SSD
WD 2TB HDD, WD 500GB HDD
Intel 4770 i7 Haswell 3.4GHz
Unsure about which GPU, or Display to get.
Sounds like a definite HD LED, or IPS, 22-24".
The IGP in the i7 4770 is sufficient for almost everything except high end gaming. You can always add a separate graphics card later if you find it's not enough for your applications.
You're obviously not "serious" about the monitor. Just look for one with a good black level and backlights that can be dimmed sufficiently for use indoors (the way they get really high "dynamic" contrast ratios is to use backlights bright enough to shoot down incoming ICBMs), then roughly calibrate it yourself with some test patterns.
Tip: ignore the hype on contrast ratios. They are fantasies. A good name-brand monitor has contrast levels that average from 750:1 to 1100:1 or so in actual use. No one can possibly look at a millions-to-one contrast ratio without being blinded.
Static contrast ratio (almost never specified by manufacturers) is the difference between the darkest and brightest parts of the picture the monitor can display at the same time. This is usually measured with a black-and-white checkerboard pattern. This value usually falls between a few hundred and a few thousand.
Dynamic contrast ratio doesn't require that the two extremes be measured at the same time or even with the same settings. The dark image may be measured with the backlight turned all the way down, even off, displaying a completely black image. The bright level may be measured with the backlight turned all the way up, displaying a totally white image. Dynamic contrast ratio is an almost useless number and ranges up to millions or even "infinite".
What you really want is good black levels (almost never specified by manufacturers) so blacks are really black, not dark grey or dark blue (blue light is the hardest for an LCD to block); and a decent static contrast ratio in the high hundreds range (again, almost never specified by manufacturers).
Note that it's possible to generate high static contrast ratios without having very dark blacks -- by having very very bright brights. A picture that bright may be good for use outdoors but is too bright for comfortable viewing in a typical home or office environment.
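To make the distinction above concrete, here is a small sketch of how the two numbers come about. The luminance readings are made-up illustrative values (in cd/m²), not measurements of any monitor in this thread.

```python
def contrast_ratio(white_luminance, black_luminance):
    """Ratio of brightest to darkest luminance the screen produces."""
    return white_luminance / black_luminance

# Static: white and black patches measured at the same time
# (e.g. a checkerboard) at a single backlight setting.
static = contrast_ratio(white_luminance=250.0, black_luminance=0.25)
print(f"static  {static:,.0f}:1")   # 1,000:1 -- typical of a good LCD

# "Dynamic": white measured with the backlight maxed out, black measured
# with the backlight dimmed nearly off -- two different settings, which is
# how the marketing numbers get so enormous.
dynamic = contrast_ratio(white_luminance=400.0, black_luminance=0.00004)
print(f"dynamic {dynamic:,.0f}:1")  # 10,000,000:1 -- marketing, not viewing
```

The same formula produces both figures; only the measurement conditions differ, which is why the dynamic number tells you almost nothing about what you will see on screen.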
Last edited by jagabo; 5th Mar 2014 at 10:31.
Sorry we have to keep going down this road toward my decision about whether to use only my i7's onboard HD 4600 IGP or buy a dedicated GPU. I just don't want to step in anything by making the wrong decision, when I could take the money I would have spent on a decent new GPU and instead buy a new IPS 22-24" HD monitor.
I am serious now about a new monitor, and I went to Fry's to look at the IPS 1920x1200 models. I saw an ASUS IPS 22" 1920x1200 for $150.00. Looks pretty awesome. They had others too, but I will delve deeper into this after I'm done with the build. While I was there, I also bought a Raidmax Inter Connect 4-port USB hub (2x USB 2.0 / 2x USB 3.0). Got home and did not notice it's only 4" wide, so I have to either go back and return it and use the USB 3.0 ports in the rear I/O, or buy the front-bay 5.25" adaptor and use the front ones.
I realize that Pinnacle Studio is frowned upon when there are so many better video editing programs out there (Adobe Premiere, Sony Vegas, etc.), which are more expensive than what I paid for my Studio 15 HD, and although it can be very problematic at times, it's all I can afford for burning family videos of sports, birthdays, and holidays. So for now, it works for me.
Please help me understand this further. I've heard that the IGP on the CPU and the GPU can share memory? The i7's HD 4600 works better for some programs, whereas the dGPU handles other functions better in others.
DOES THE BOARD USE BOTH THE IGP AND THE GPU AT THE SAME TIME? EITHER/OR?
I thought that when you install a GPU, like for instance a GTX 650, GTX 770, etc., one would have to change which graphics to use in the BIOS?
I still 'think' that if I get the Nvidia GTX 650 (2GB, GDDR5), I would be better served than by my i7 graphics, even if I don't overclock or game, but at this point I still have no idea. Heck, someone even suggested I keep using my old Nvidia 8600GT until I get a new one. Really? They also added that this 8600GT might even be better than the HD 4600 IGP on the i7... another REALLY? (But I would step up to the newer cards anyway, if I need one.)
Thank you for your patience!
Windows 7 and above, plus some of the programs that run on it, in addition to current BIOSes, allow simultaneous use of on-board and PCIe graphics. There are some free and payware programs that manage what displays what: one monitor might be designated the boot monitor and/or main program monitor; another the edit view (of the NLE you are using, should that program be sophisticated enough to allow it); another a web browser, etc. You can even d/l wallpaper to span all monitors in between your awesome NLE sessions. Mix and match and take the adventure. Installing the Nvidia can never hurt, so stop being a kvetch and get it already.
For the nth time, with the possible exception of certain Intel processors, I don't have/haven't ever owned anything whose name starts with "i".
Take a look at the ASUS PA248Q 1920x1200 monitor, and its smaller variant, the PB238Q, a 1920x1080 model that is similar but lacks the factory color calibration. Both have IPS screens, excellent viewing angles, and multiple inputs (VGA, DVI, HDMI, and DisplayPort). They are pricier than the one you mentioned, but you might prefer them. I'm staring at a PA248Q right now.
WELL... I WENT AHEAD AND DID IT!
I was at Fry's last night to get a few LED 120mm case fans. I just happened to meander over to the graphics aisle and saw the EVGA GTX 750, 1 GB GDDR5, for $135.00 with a $10.00 rebate.
After all of these past thread posts going back and forth, asking nice, seasoned folks like you whether I would benefit from a dedicated GPU or be satisfied using the 4770 CPU's HD 4600 graphics, I finally went for the card, since about half of you advised getting the card and the other half told me to just use the IGP.
Some decisions have to be made on instinct, and having had the GeForce 8600GT for the past seven years, I think overall I would be better off having this card than not having it. Besides, maybe someday I may want to do a little small-time gaming.
(?) When I set up my 8600GT, I had to select which graphics controller to use in the BIOS, and I chose PCIe. This time, with the i7 4770 and the EVGA GTX 750, will it ask me again, will it auto-select, or will they automatically be used together?
Thank you for the kind words and for all the helpful advice. You all are awesome!! My next purchase? Either an IPS LED HD 1680x1050 or 1920x1200, and maybe a BD burner.
I think you made a mistake by purchasing a video card before determining if the IGP on the 4770 is sufficient for your needs. Very few programs use the GPU filtering functions effectively. The GPU video encoders deliver poor quality compared to CPU encoders. And with a fast CPU, the GPU encoders aren't any faster. Intel's Quick Sync H.264 encoder is the exception (it's faster than the CPU but still delivers inferior quality) -- and you don't need an Nvidia GPU to use that. For desktop applications you'll see no difference in performance between the 4770 IGP and a GTX 750.
rivrbyte... sorry for being late in answering; life interfered. I see you got a new vid card, but I will answer your question anyway. BTW, my computer specs are under my profile; just click on my name / view profile. As long as I was working with an older camcorder at low res, all was fine using my integrated video. My new camcorder (Canon) shoots AVCHD and, while Pinnacle edited it with no problems, any playback or preview testing was jerky, with tons of dropped frames and hesitations. My new card instantly solved that.
While I was there, I also bought a Raidmax Inter Connect 4 Port USB Hub
Pinnacle: No, Pinnacle is not top of the line, but you aren't paying top of the line money either. The thing I do not like is that they have a "payfer" method for additional features that should have been stock, for example more title templates, various filters, etc. If you are going to purchase, get the Ultimate version for a bit more. In the past Studio was as buggy as a swamp in July, but that was version 10. I have found the version 15 (what I use) to be pretty much problem free. It is quite powerful once you get familiar with it. I have tons of titles, transitions, I can cut & clip freely, do voice-overs, etc. I have a collection of music clips I downloaded from Youtube, videos, recorded music, etc. and edited in Audacity, then imported into Pinnacle for background music, etc. There is a lot more it will do that I have never needed to investigate such as create videos for Youtube and such. I have done several business videos in it, not for broadcast but conferences, meetings, etc. and the results were great (IMHO). Anyway, just my take on it.
For now I have the 1280 x 1024 monitor; I will hopefully upgrade soon, maybe to an IPS LED 22"-24", up to 1200 vertical res, or more expensively up to 1440. I've been reading more articles about the CPU's HD 4600 graphics outperforming GPUs overall and handling 1080p video. I'm still hearing that I do not need a GPU, and that the onboard HD 4600 will be a good fit for my light video editing and Photoshop work.
"If you ever go 1080p or beyond; get a 770 with 4GB."
That's why I'm still somewhat confused. I sent Seasonic an email asking what they thought, since they made the darn thing, and they sent me a reference link:
"Yep, you need at least get a 575W, however, our 550W will have about 600W max output."
Sounds like I'd be living on the edge if I don't get at least a 600W or 650W PSU. I know you all will say just use the 550, but what would happen if it was still not enough (not a good scenario), or if I upgrade to a bigger GPU (for what, I have no idea) in the future? Then I would be stuck with this 550W. So better now, BEFORE I open the box and use it, I should still think about whether I NEED a BIGGER-WATTAGE PSU.
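For anyone weighing the same question, here is a rough PSU-sizing sketch. The CPU and GPU figures are the manufacturers' rated TDP/board-power numbers; the motherboard, drive, and fan figures are loose assumptions for illustration, so check real spec sheets before deciding.

```python
# Back-of-the-envelope PSU sizing for the build discussed in this thread.
components = {
    "i7-4770 CPU":       84,   # Intel's rated TDP
    "GTX 750 card":      55,   # Nvidia's rated board power
    "motherboard + RAM": 50,   # assumed
    "SSD + 2 HDDs":      25,   # assumed
    "fans + USB + misc": 30,   # assumed
}

total = sum(components.values())          # estimated peak draw in watts
headroom = 0.5                            # run the PSU at roughly 2/3 load or less
recommended = total * (1 + headroom)

print(f"estimated draw: {total} W")       # 244 W
print(f"suggested PSU:  {recommended:.0f} W and up")  # 366 W and up
```

By this estimate a 550W unit has ample margin for a GTX 750. Swapping in a GTX 770 (~230 W board power) pushes the estimated draw to roughly 420 W, which is in line with Seasonic's "at least 575W" advice quoted above.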
I have STUDIO 15 HD ULTIMATE, and I do have a Canon digital cam that shoots AVCHD, so the consensus is for me to exchange the GTX 750 for a GTX 770 with 4GB memory and up my RAM to 32 GB... but I'm still concerned about my 550W PSU not being enough power.