VideoHelp Forum
Results 1 to 27 of 27
  1. Member (Jan 2004, United States)
    Hi, I currently have this CPU, board and memory:
    • Intel Core i7-2600K Sandy Bridge 3.4GHz
    • ASUS P8Z68-V PRO Motherboard
    • G.SKILL Ripjaws Series 8GB (2 x 4GB) 240-Pin DDR3 SDRAM DDR3 1600

    I'd like to add more RAM. The question is: should I add two more 8GB sticks, or replace the old sticks with new ones? I figure adding to what I have should be fine, because I already have the fastest memory the board supports. Am I correct here?
  2. I've had mixed results adding to existing memory. Sometimes the different pairs don't play well together. I'd go with a new 16GB kit, although if I could scrounge a similar pair of 4GB sticks, I would.
  3. Member (Jan 2004, United States)
    Even if I buy the same brand of memory with same specs, just 8GB sticks instead of 4GB? Still wouldn't do it?
  4. Just add 2 more compatible sticks. This almost always works fine in desktops.

    Now, there might be a minor difference in speed between the pairs (e.g. the 2x8GB pair might be accessed slightly faster than the 2x4GB pair).

    If you're really worried about that, just install the 2x8GB sticks in the first bank (the first pair of slots) and the 2x4GB sticks in the second bank.

    32GB is the max.

    Honestly, there's nothing difficult or risky about it as long as the RAM is compatible.

    Given the board's age, I'd match the speed of the new RAM to the existing sticks to avoid any issues with RAM running at different speeds in different slots.
    https://www.asus.com/Motherboards/P8Z68V_PRO/HelpDesk_QVL/

    You can upgrade to a 3rd gen i7, too. But usually, if you need more GPU power, an Nvidia 1050 Ti mini or better (depending on your power supply) will give this old system a good boost without requiring a new power supply with the wattage and card connectors needed for higher-power cards like the 2080.
  5. aBigMeanie aedipuss (Oct 2005, 666th portal)
    if at all possible i'd consider waiting until you can upgrade mb/cpu/ram. the 2600k is a nice processor but it's getting a little bit long in the tooth. and there's no microcode(firmware) fix for a couple cpu exploits....
    --
    "a lot of people are better dead" - prisoner KSC2-303
  6. Member (Jan 2004, United States)
    3rd gen i7? You mean there's a better CPU I can put into this board? I thought this was the best one available at the time, which was around 2012 or 2013..
  7. aBigMeanie aedipuss (Oct 2005, 666th portal)
    no i said upgrade mb/cpu/ram, i.e. motherboard, cpu and ram, all at the same time. they're all many generations past the 2600k now. p.s. i also have a 2600k box still working, but it's waiting for an upgrade.
    --
    "a lot of people are better dead" - prisoner KSC2-303
  8. Member (Jan 2004, United States)
    ahhh, I was hoping not to have to upgrade the board too.. What I really want is a bigger case; everything is so darn cramped in the case I have now.
    What would you suggest for a new board, CPU and RAM combo for HD (not 4K) video work? I also have a recently purchased RX 590 video card.
  9. Dinosaur Supervisor KarMa (Jul 2015, US)
    16GB of RAM is a cheap upgrade and certainly worth it if you have no plans to upgrade the CPU. I have two machines with roughly equally powerful CPUs, but one has 8GB while the other has 16GB. The 16GB is definitely worth it, especially since I'm still completely on HDDs.
  10. aBigMeanie aedipuss (Oct 2005, 666th portal)
    me, i start with the biggest full-sized cases i can find, like the cooler master haf 932 - i can fit 7 hdds in it along with 3 5.25in optical drives.

    [Attachment 51615]

    i'm tempted by the new amd ryzen 7s like the 3800 or ryzen 9s like the 3900 but never had much luck with amd builds in the past. imo amd threadrippers seem a throwback to their overheating days, but intel has been snoozing for a couple years. don't know what i'd go with right now.
    --
    "a lot of people are better dead" - prisoner KSC2-303
  11. 3rd gen i7 can be used.
    See the earlier link to the ASUS compatibility list (QVL).
    There's no significant CPU speed increase over the 2nd gen i7; the integrated Intel HD 4000 GPU is about 40% faster (but it's easier to just drop in a cheap 1050 Ti for a far faster GPU).

    Personally, I wouldn't bother. I'd go for:
    • A better Nvidia GPU
    • More RAM
    • An SSD

    Those are the best upgrades for an old system like yours. It's still very good for everyday use.

    If you're thinking render monster....
    https://www.pugetsystems.com/labs/articles/Premiere-Pro-CPU-performance-Intel-Core-X-1...-3rd-Gen-1629/

    Feel free to go nuts $$$$
  12. ..
  13. Member (Jan 2004, United States)
    Do you think I made a mistake buying the RX 590? I don't do any gaming, just strictly encoding.
    But I was also told that CPU encodes produce better quality.
    I do have a SAMSUNG 840 EVO SSD for the system drive
  14. Dinosaur Supervisor KarMa (Jul 2015, US)
    Originally Posted by sdsumike619 View Post
    Do you think I made a mistake buying the RX 590? I don't do any gaming, just strictly encoding.
    But I was also told that CPU encodes produce better quality.
    I do have a SAMSUNG 840 EVO SSD for the system drive
    Yes, the RX 590 is overkill for someone who doesn't game; if you just wanted the video encoder chip, you could have gone for an RX 460-560. x264 and x265 are CPU-based encoders and they do produce better quality than their GPU counterparts.
  15. Member (Jan 2004, United States)
    https://www.newegg.com/sapphire-radeon-rx-590-100415p8gl/p/N82E16814202333?Item=N82E16814202333
    is what I have now. Are you saying that a 1050ti would have been a better choice?
  16. Member (Jan 2004, United States)
    I just got off the phone with G.Skill and told them I wanted to upgrade the memory by adding two 8GB sticks to my existing two 4GB sticks. The guy on the phone said they don't recommend mixing different-size memory sticks in the same board... Is there any truth to that? I was planning to buy this:

    https://www.newegg.com/g-skill-16gb-240-pin-ddr3-sdram/p/N82E16820231568?Item=N82E16820231568 as it's the same brand and specs of my current memory.
    Last edited by sdsumike619; 22nd Jan 2020 at 12:04.
  17. Dinosaur Supervisor KarMa (Jul 2015, US)
    Originally Posted by sdsumike619 View Post
    https://www.newegg.com/sapphire-radeon-rx-590-100415p8gl/p/N82E16814202333?Item=N82E16814202333
    is what I have now. Are you saying that a 1050ti would have been a better choice?
    A 1050 Ti isn't much cheaper than an RX 590. Or you could have just used the HD Graphics 3000 built into your Intel Core i7-2600K. Not only does your i7-2600K have a low-power GPU built in, it also supports H.264 decode and encode acceleration (Quick Sync). My 8GB RAM computer has no dedicated GPU and instead uses the Intel graphics on the CPU, and it works fine for all tasks including low-end gaming.

    Why exactly do you think you need a GPU? Certain higher-end video editing can use GPUs for visual effects, which would make an RX 590, or a 1050-1080, a good purchase. Other than that, I'm not sure why you think you need a dedicated GPU card.

    Originally Posted by sdsumike619 View Post
    I just got off the phone with G.Skill and told them I wanted to upgrade the memory by adding two 8GB sticks to my existing two 4GB sticks. The guy on the phone said they don't recommend mixing different size memory sticks in the same board... Is there any truth to that? as I was planning to buy this:

    https://www.newegg.com/g-skill-16gb-240-pin-ddr3-sdram/p/N82E16820231568?Item=N82E16820231568 as it's the same brand and specs of my current memory.
    It might not be as good as having all sticks the same size, but running out of RAM is much worse than any minor slowdown from mixing RAM sizes.
  18. Member (Jan 2004, United States)
    I don't know; a few years back GPU video encoding was all I heard about, so I had a GTX 570, which did a great job for video encoding. Then it self-destructed, so I researched a bit and came up with this RX 590. I can't figure out how to use QSV. It isn't available; how do you make it available for encoding? Don't you need the GPU when video editing, for the preview, transitions, and effects?
  19. Video Restorer lordsmurf (Jun 2003, dFAQ.us/lordsmurf)
    I'd use 2x 8gb sticks, or 2x 16gb sticks.

    I've always liked fewer sticks: less stuff to generate heat. Heat has been a main concern of mine in recent years, and adding more fans to create a wind tunnel doesn't help; the heat is still created. I want to NOT create it, not just move it around. Water cooling is equally unacceptable; it makes a computer sound like a vibrator.
    Want my help? Ask here! (not via PM!)
    FAQs: Best Blank Discs • Best TBCs • Best VCRs for capture • Restore VHS
  20. Dinosaur Supervisor KarMa (Jul 2015, US)
    Originally Posted by sdsumike619 View Post
    I don't know, a few years back GPU video encoding was all I heard about, so I had a GTX 570 which did a great job for video encoding. Then it self destructed so I researched a bit and came up with this RX 590. I can't figure out how to use the QSV. It's not available, how do you activate to be available for encoding?
    The RX 590 is a fine card, but if most of what you are aiming for is H.264 encoding and decoding, the lower-end Nvidia/AMD cards tend to use the exact same encoder chips as the higher-end cards from the same year, since the encoder chips are separate from the actual GPU cores. So my RX 460 has the same encoding ability as an RX 480, and Nvidia does a similar thing.


    Originally Posted by sdsumike619 View Post
    Don't you need the GPU when video editing for the preview, and transitions, and effects?
    It heavily depends on what you are doing and what NLE you are using. Certain NLEs support Nvidia cards, others AMD, many support Quick Sync, and some support a mixture of the three. It might be good to start off by saying what program you use as an NLE and where you think the bottleneck in your workflow is.

    Originally Posted by lordsmurf View Post
    I'd use 2x 8gb sticks, or 2x 16gb sticks.

    I've always liked less sticks, less stuff to generate heat. Heat is a main concern of mine in recent years. And adding more fans, creating a wind tunnel, doesn't help, heat is still created. I want to NOT create it, not just move it around. Water cooling is equally not acceptable, makes a computer sound like a vibrator.
    You are looking at about 10 watts per stick under a 100% write load, and well under 5 watts at idle. Power draw also correlates with total RAM size more than with the number of sticks.
    Last edited by KarMa; 22nd Jan 2020 at 12:33.
  21. Member (Jan 2004, United States)
    I only use Vegas, version 16 now. When I was on version 13, Windows 7 and the 570 card were a great combination for my needs. Now that I've upgraded to v16, Windows 10 is recommended, and if I go to 17, it's required. I need to move to Windows 10 anyway because 7 is officially no longer supported, so I'll just add two more 8GB sticks and hope for the best at this point.
  22. Originally Posted by sdsumike619 View Post
    Do you think I made a mistake buying the RX 590? I don't do any gaming, just strictly encoding.
    But I was also told that CPU encodes produce better quality.
    I do have a SAMSUNG 840 EVO SSD for the system drive
    1. In general, price aside:
    https://pc.watch.impress.co.jp/img/pcw/docs/1230/504/graoh_01_l.png
    https://www.pugetsystems.com/labs/articles/Media-Encoder-CC-2018-Transcoding-NVIDIA-Ge...eon-Vega-1205/
    https://www.pugetsystems.com/labs/articles/Premiere-Pro-CC-2019-AMD-Radeon-VII-vs-NVID...orce-RTX-1395/
    https://www.pcgamesn.com/amd-rx-590-review-benchmarks-powercolor-red-devil?amp
    https://m.youtube.com/watch?v=CLqpVImLPGE
    https://helpx.adobe.com/uk/premiere-pro/system-requirements.html#gpu-acceleration

    AMD's support isn't as widespread as Nvidia's, performance is slower at the same price point, performance is slower at the top end, etc.

    Isn't a "bad" choice if price vs performance was a big factor, but there's always better.

    2. QSV may not be available if the AMD GPU is active, leaving the Intel GPU inactive.
    You'll need to enable both GPUs.
    https://mirillis.com/intel-quick-sync-setup-action-tutorial

    3. Quality
    https://devblogs.nvidia.com/turing-h264-video-encoding-speed-and-quality/
    Nvidia has the tuning down on its latest encoder - excellent performance on their latest cards.

    That said, note the curve of each graph - increasing bit rate always significantly increased picture encode quality regardless of encoder used.

    I.e. starve QSV, x264, etc. of bitrate and you'll get so-so encodes.
    As long as file size isn't that important, simply slide the quality setting in Handbrake up from 20 to 12-15.

    But today, Nvidia can produce encodes that can stand in for x264 at the same bitrate without too much concern. (You can always test each encode to determine the best.)

    ....

    That said, let's let the broadcast industry evaluate it properly.
    (I.e. which is "best" is complex, because it varies with the bitrate, the video, and the person watching it.)

    https://www.streamingmedia.com/Articles/Editorial/Featured-Articles/Hardware-Based-Tra...tors_selection

    In general, short of selecting super-slow x264 encodes with custom tuning, most of us can get visually similar, acceptable encodes from QSV and Nvidia as long as the absolute smallest file size isn't the primary goal. Just push that quality slider up from 20 towards the 12-15 range in Handbrake, as in the sketch below.
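
    To make that concrete, here's a minimal sketch (mine, not from this thread) that batch-encodes the same clip with the software and hardware encoders at a higher quality setting, so you can compare them on your own footage. It assumes HandBrakeCLI is installed and on your PATH, the source filename is hypothetical, and the encoder names (x264, qsv_h264, nvenc_h264, vce_h264) follow current HandBrake builds and may differ in older versions.

    Code:
    import subprocess

    SOURCE = "input.mkv"   # hypothetical source clip
    ENCODERS = ["x264", "qsv_h264", "nvenc_h264", "vce_h264"]

    for enc in ENCODERS:
        out = f"test_{enc}_rf15.mp4"
        cmd = [
            "HandBrakeCLI",
            "-i", SOURCE,
            "-o", out,
            "-e", enc,   # video encoder
            "-q", "15",  # constant-quality value ("slide up from 20 to 12-15")
        ]
        print("Running:", " ".join(cmd))
        # a hardware encoder that isn't available just fails that one run
        subprocess.run(cmd, check=False)

    Any encoder that isn't exposed on your hardware will simply error out for that run, which is itself a quick way to see what your setup supports.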
  23. Member (Jan 2004, United States)
    Why is it showing all these displays?
    I have two connected to the RX 590 and I have a TV connected to the HDMI port on the motherboard.
    [Attachment 51626 - displays.PNG]
  24. aBigMeanie aedipuss (Oct 2005, 666th portal)
    win7 and hdmi don't play well together. you can try an old fix - "right-click the desktop and select screen resolution, select the monitor you want to remove, open the "multiple displays" drop-down, click disable this display -> press apply -> open the "multiple displays" drop-down again and you will be presented with "remove this display" -> apply."
    --
    "a lot of people are better dead" - prisoner KSC2-303
  25. Member (Jan 2004, United States)
    Thanks, but I'm not even going to deal with it. I'm going to take the plunge and upgrade to Win 10 soon enough.
  26. Dinosaur Supervisor KarMa (Jul 2015, US)
    Looks like Vegas does not support Intel Quicksync. But it does seem to support OpenCL and CUDA.

    Originally Posted by babygdav View Post
    But today, nvidia can produce encodes that can swap for x.264 at the same bitrate without too much concern. (You can always test per encode to determine the best)
    At internet bitrates (sub-10 Mbit 1080p), x264 is going to have a very noticeable advantage. But the higher the bitrate, the smaller the difference naturally becomes.
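
    If you want to check that on your own footage, here's a rough sketch (my own example, not something from Vegas or this thread): encode the same 1080p clip with x264 and NVENC at the same sub-10 Mbit bitrate using ffmpeg, then score each encode against the source with VMAF. It assumes an ffmpeg build that includes libx264, h264_nvenc and libvmaf, and the filenames and 8 Mbit figure are just placeholders.

    Code:
    import subprocess

    SRC = "source_1080p.mp4"   # hypothetical reference clip
    BITRATE = "8M"             # "internet" bitrate, sub-10 Mbit 1080p
    ENCODERS = {"x264": "libx264", "nvenc": "h264_nvenc"}

    for name, codec in ENCODERS.items():
        out = f"{name}_{BITRATE}.mp4"
        # single-pass ABR keeps the sketch short; two-pass would be fairer to x264
        subprocess.run(["ffmpeg", "-y", "-i", SRC,
                        "-c:v", codec, "-b:v", BITRATE, "-an", out], check=True)
        # VMAF score is printed in ffmpeg's log: first input = distorted, second = reference
        subprocess.run(["ffmpeg", "-i", out, "-i", SRC,
                        "-lavfi", "libvmaf", "-f", "null", "-"], check=True)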
    Last edited by KarMa; 22nd Jan 2020 at 23:45.
  27. [Attachment 51635]
    [Attachment 51636]


    If QSV isn't available, enable AMD (VCE) encoding in Handbrake instead.
    Otherwise, if you can't enable Intel Quick Sync because of the GPU card, you'll have to pull the GPU card and run only the Intel GPU to access QSV.

    ...

    Vegas Video 17 supports MP4 QSV encoding if you select it by customizing the templates for MP4 encodes.
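
    If you're not sure which hardware encoders your setup actually exposes, here's a quick sketch of my own (it assumes ffmpeg is installed and on your PATH; HandBrake and Vegas do their own detection, so treat this only as a rough check). It lists the QSV/NVENC/AMF encoders compiled into ffmpeg, then runs a one-second test encode through h264_qsv to see whether Quick Sync is actually usable.

    Code:
    import subprocess

    # List the hardware H.264/HEVC encoders this ffmpeg build was compiled with.
    encoders = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                              capture_output=True, text=True).stdout
    for line in encoders.splitlines():
        if any(tag in line for tag in ("qsv", "nvenc", "amf")):
            print(line.strip())

    # One-second synthetic clip pushed through h264_qsv; a failure here usually
    # means the Intel iGPU is disabled in the BIOS or its driver isn't loaded.
    test = subprocess.run(
        ["ffmpeg", "-y", "-f", "lavfi", "-i", "testsrc=duration=1:size=1280x720:rate=30",
         "-c:v", "h264_qsv", "qsv_test.mp4"],
        capture_output=True, text=True)
    print("QSV works" if test.returncode == 0 else "QSV not available")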