
What GPU and CPU are ideal for a penetration testing job?


Srazario

Recommended Posts

I want to build a workstation to polish my pentesting skills for Security Analyst job interviews, and I want something powerful enough to handle all the security-related work. What GPU and CPU are best for cracking passwords and hashes while also handling serious workloads? Are professional GPUs better for this sort of work, or gaming GPUs? Also, which CPU should I go for: a Xeon or an i7 Extreme?

Thanks

Link to comment
Share on other sites

Regarding the CPU, not much is needed.

I suppose whatever you consider mid-level is overkill, but you probably need some CPU headroom for the Microsoft Office suite so you can handle your administrative overhead decently.

Regarding the GPU, if you have the right tools, the best GPU you can fit and afford will serve you well. The GPU can be used for parallel programming and parallel computing, which means you can put its thousands of cores to work on password cracking, giving you far more throughput than any current CPU.

https://en.wikipedia.org/wiki/General-purpose_computing_on_graphics_processing_units
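The parallel structure that makes GPUs suit this work can be sketched on an ordinary CPU. This is a toy illustration, not real tooling: MD5, the three-character lowercase keyspace, and the target password are all made up for the example, and a thread pool stands in for the thousands of hardware cores a GPU would bring to bear.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor
from itertools import product
from string import ascii_lowercase

# Target: MD5 of a short lowercase password (purely illustrative --
# real jobs target NTLM, bcrypt, etc. with hashcat on a GPU).
TARGET = hashlib.md5(b"gpu").hexdigest()

def crack_chunk(first_char):
    """Search every 3-character candidate starting with first_char."""
    for rest in product(ascii_lowercase, repeat=2):
        candidate = first_char + "".join(rest)
        if hashlib.md5(candidate.encode()).hexdigest() == TARGET:
            return candidate
    return None

def crack():
    # Split the keyspace by first character, one chunk per worker.
    # This embarrassingly parallel split is exactly what a GPU
    # exploits, except with thousands of hardware cores at once.
    with ThreadPoolExecutor() as pool:
        for result in pool.map(crack_chunk, ascii_lowercase):
            if result:
                return result
    return None

print(crack())  # -> gpu
```

Each chunk is independent of the others, which is why the workload scales almost linearly with core count.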

Edited by IDNeon

You just have to be careful about support. Advice swings between Nvidia and AMD, and some devices aren't supported or don't work as well as expected.

The highest-priced or highest gaming-spec card isn't always the best.


6 minutes ago, digininja said:

You just have to be careful about support. Advice swings between Nvidia and AMD, and some devices aren't supported or don't work as well as expected.

The highest-priced or highest gaming-spec card isn't always the best.

Yeah, I suppose, but I figure you're already stepping outside the scope of their intended applications, so it's better to look at what is common in scientific computing settings; just ask the vendor what they recommend for large multivariable statistical analysis.

All of this is academic. I think in pentesting, proving you CAN brute force is more important than ACTUALLY brute forcing.

Whether it takes your system 100 million years versus the latest and greatest supercomputer is irrelevant.

You don't need to brute force anything to prove the vulnerability exists.

For instance, do you have a lockout policy on your domain accounts for OWA logins? No? Then they CAN be brute forced.
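The "100 million years" point can be made concrete with back-of-the-envelope keyspace math. The guess rates below are ballpark assumptions chosen for illustration, not measurements:

```python
def years_to_exhaust(charset_size, length, guesses_per_second):
    """Worst-case time to exhaust a full keyspace, in years."""
    keyspace = charset_size ** length
    seconds = keyspace / guesses_per_second
    return seconds / (60 * 60 * 24 * 365)

# 8-character password over the 95 printable ASCII characters.

# A single high-end gaming GPU doing ~10 billion fast-hash (NTLM-class)
# guesses per second -- an assumed ballpark figure:
print(years_to_exhaust(95, 8, 10e9))  # well under a year

# The same keyspace guessed against an online login at 10 attempts/sec:
print(years_to_exhaust(95, 8, 10))    # tens of millions of years
```

The vulnerability (no lockout policy) exists either way; the arithmetic just shows why demonstrating that it exists matters more than grinding through the keyspace.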

Edited by IDNeon

It depends on the situation. I'm on a test at the moment where I popped a box and managed to pull the local SAM, and I brute forced that to get the password for an account that was reused across the network. This test is being done quietly, with only a couple of people in the security team aware it is going on; going to them to ask for the password isn't in the spirit of the test, so I have to get it myself.

Don't mix up offline and online brute forcing: a lockout policy has no effect on offline brute forcing using GPUs, and GPUs don't help with going after an OWA login.
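The offline case is the one GPUs accelerate: you already hold the hash, so no server can lock you out or throttle you. A minimal sketch of an offline dictionary attack, with MD5, the hash value, and the toy wordlist all invented for the example:

```python
import hashlib

# Offline attack: we already hold the hash (e.g. pulled from the SAM),
# so there is no lockout policy in play and a GPU could run flat out.
# MD5 is purely illustrative; NTLM etc. are the realistic targets.
stolen_hash = hashlib.md5(b"Winter2017").hexdigest()

wordlist = ["password", "letmein", "Winter2017", "qwerty"]  # toy wordlist

def dictionary_attack(target_hash, words):
    """Hash each candidate word and compare against the stolen hash."""
    for word in words:
        if hashlib.md5(word.encode()).hexdigest() == target_hash:
            return word
    return None

print(dictionary_attack(stolen_hash, wordlist))  # -> Winter2017
```

An online attack, by contrast, is one network round trip per guess, which is why GPU horsepower buys you nothing there.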


Use oclHashCat with a decent GPU.

CPUs are no longer keeping up with the computational power of GPUs when it comes to password cracking.

Generally speaking, any GPU that is powerful for gaming will also be decent at processing hashes, so go for a high-end gaming GPU and oclHashCat.

Speaking from experience.
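The CPU-vs-GPU gap described above is easy to see for yourself: measure your own single-core hash rate and compare it against published GPU benchmark figures. A rough micro-benchmark sketch (the GPU figure in the comment is a ballpark from public hashcat benchmarks, not something this code measures):

```python
import hashlib
import time

def cpu_md5_rate(seconds=1.0):
    """Rough single-core MD5 hashes/second on this CPU."""
    count = 0
    payload = b"benchmark"
    deadline = time.perf_counter() + seconds
    while time.perf_counter() < deadline:
        hashlib.md5(payload).hexdigest()
        count += 1
    return count / seconds

rate = cpu_md5_rate()
print(f"~{rate:,.0f} MD5/s on one CPU core")
# High-end gaming GPUs benchmark in the billions of MD5/s with
# hashcat -- several orders of magnitude beyond a CPU core.
```

Python call overhead makes this a pessimistic CPU number, but even optimized CPU crackers trail GPUs by orders of magnitude on fast hashes.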


Again, no. Being good at gaming is not necessarily the same as being good at password cracking. They can coincide, but not always, and you don't want to spend a fortune on a card that doesn't work.

Go to the Hashcat site and check their lists.


On 28/03/2017 at 8:21 AM, digininja said:

Again, no. Being good at gaming is not necessarily the same as being good at password cracking.

Hence 'Generally speaking'.

Also, I disagree. Modern gaming means that the graphics cards designed for it have to complete a lot of mathematical computations every second, which is exactly what you want in a password cracking rig as well.

 

You're also incorrect regarding the site listing this.

Quotes:

"What is the best GPU for hashcat for "total speed", "speed by watt" and "speed by price"?

Due to the nature of this subject changing continuously there is no definitive answer."

"Is there a general benchmark table for all GPU?

ATM, there is no such table."

 

There are 2 lists linked, but they are quoted as being "outdated" - https://hashcat.net/wiki/doku.php?id=frequently_asked_questions#what_is_the_best_gpu_for_hashcat_for_total_speed_speed_by_watt_and_speed_by_price


I agree that there are some graphics cards out there that will squeeze a little more power into computation, as used in science labs for example.

However, generally speaking, the at-home password cracking rigs are pretty much just built from gaming graphics cards, because they do the job perfectly well, with the added bonus of being good for playing games.


6 minutes ago, haze1434 said:

Also, I disagree. Modern gaming means that the graphics cards designed for it have to complete a lot of mathematical computations every second, which is exactly what you want in a password cracking rig as well.

 

I know this is way out of date, but have a look at this thread from the Hashcat forum; it describes an Nvidia card that is better for gaming but worse than the AMD equivalent for password cracking.

https://hashcat.net/forum/thread-2181.html

The same still holds true today: both types of card are designed to do the same things, but they do them in different ways; some work better for games, some for cracking.


How many parallel processes a GPU can run at the same time, and how fast it can run them, is what matters for cracking, even when the card sucks for gaming. It may not have all the 3D shaders or physics capabilities of cards that are better for gaming or 3D modelling; raw processing power doesn't always translate to a better gaming experience, yet it may make for better cracking thanks to driver support and the sheer number of math crunches it can do. In most cases, a CUDA-based card is a good investment for Linux-based systems with driver support; just make sure it's listed on the hashcat site as compatible with your OS and in the range you want for cracking-to-money value. AMD's cards in many cases, while not always the best for gaming, have been known to be better math crunchers, outperforming the CUDA cards for less money. So pick your poison, but do the research on the OS, CPU, and mobo combination that will work best with your GPU. As digininja has mentioned, check with the hashcat crowd for reviews, compatibility, and real-world examples of their setups.


No. It is the ones that support the features required by the tools you intend to use. I don't know what the definition of a "professional GPU" is, but if it isn't supported by the cracking tool, the OS, or the motherboard, then it isn't going to be any use.

Check the tool you want to use, then go to its site or forum and see what they recommend. That may be an amazing gaming GPU; it may be a dirt-cheap "professional" GPU that no one has ever heard of; it may be a pair of cheap cards that do better than a single expensive one.


"any thoughts?"

 

Yes, ask them if it worked well and if it was easy to set up. If it was, check whether it's compatible with your rig; if it is, then you've found a card that will work. Log the specs and the price.

Repeat the process with other recommendations from people with working cards, then, when you've got a few to compare, buy the best you can afford.

What I'm trying to do is save you from buying something recommended by people who don't really know what they are talking about, or who don't have up-to-date knowledge of what is out there. Get to the Hashcat forums and ask on there; you'll get a much better set of replies from people who are actually using this kit.


2 hours ago, Srazario said:

One person on Hashcat suggested the Nvidia GTX 1080 Ti. Any thoughts?

The GTX 1080 Ti is one of the newer cards, and also more expensive. This is why matching bang-for-your-buck with what actually fits in your rig matters. If your mobo can't make use of the full PCI-e card's features, it makes no sense to buy it only to have it rate-limited by older hardware specs. This is why you have to research the main features against your budget, and compare GPUs across different mobos, as some boards work better with different CPU setups and PCI-e bus lane capabilities. I've had cards in the past that didn't even fit in machines due to the size of the mobo components and case, so unless you're building from scratch, these are all things to think about before you buy. Especially since opened electronics, unless damaged or DOA, aren't returnable in most cases, only exchangeable for the same component.

 

Example: https://www.trentonsystems.com/industry-applications/pci-express-interface

 

edit: I should note, so there's no confusion, that you can use a PCI-e 3.0 card in a 2.0 slot (I do this with my rig, and it's fine for gaming), but you're wasting your money by not getting the full 3.0 + 3.0 combined performance, and you'll be limited to 2.0 mobo speeds. If you buy a 3.0 card and only have a 2.0 slot, you're probably not going to get the card's full potential, so don't waste your money if this is a rig bought for specific tasks.
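The 2.0-vs-3.0 slot difference is straightforward to quantify from the published PCI Express figures: 2.0 runs at 5 GT/s with 8b/10b encoding, 3.0 at 8 GT/s with the more efficient 128b/130b encoding. A quick calculation:

```python
def pcie_bandwidth_gbs(gen, lanes=16):
    """Theoretical one-direction bandwidth in GB/s for a PCIe link."""
    # (transfer rate in GT/s, encoding efficiency) per PCIe generation
    gens = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    rate, efficiency = gens[gen]
    # Each transfer moves 1 bit per lane; divide by 8 for bytes.
    return rate * efficiency * lanes / 8

print(pcie_bandwidth_gbs(2))  # 8.0 GB/s for a 2.0 x16 slot
print(pcie_bandwidth_gbs(3))  # ~15.75 GB/s for a 3.0 x16 slot
```

So a 3.0 card in a 2.0 x16 slot gives up roughly half its theoretical host bandwidth, which is the rate limiting described above. (For hash cracking specifically the link matters less than for gaming, since candidate generation mostly stays on the card.)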

Edited by digip

Not sure why no one has pointed this out, but there's a whole market for exactly this in the GPU industry, which is why you find GPUs that perform better at this task but are not as good for gaming.

I'm sure GPU manufacturers have sales teams devoted to explaining what's best for this.


On 3/27/2017 at 1:01 PM, digininja said:

It depends on the situation. I'm on a test at the moment where I popped a box and managed to pull the local SAM, and I brute forced that to get the password for an account that was reused across the network. This test is being done quietly, with only a couple of people in the security team aware it is going on; going to them to ask for the password isn't in the spirit of the test, so I have to get it myself.

Don't mix up offline and online brute forcing: a lockout policy has no effect on offline brute forcing using GPUs, and GPUs don't help with going after an OWA login.

Well, to clarify your statement a little:

The only reason GPUs don't help against OWA is other limiting factors, such as how fast you can send attempts at the OWA endpoint.

All of it still depends on speed; it's just a matter of what's bottlenecking you and reducing that. I'm sure there's a laundry list of optimizations for OWA/firewall account cracking where accounts don't have lockout policies.
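The bottleneck in the online case can be sketched directly. Everything here is hypothetical: the `check_login` stub stands in for a real OWA POST (it just simulates round-trip latency), and the "correct" password is invented for the example.

```python
import time

def check_login(username, password):
    """Stand-in for a real OWA login attempt (one network round trip).
    Real code would POST to the login form; this stub only simulates
    the latency that bottlenecks online guessing."""
    time.sleep(0.01)  # pretend network round trip
    return password == "Summer2017"  # hypothetical correct password

def online_guess(username, candidates, max_rate=5.0):
    """Throttled online guessing: the network and lockout/detection
    thresholds set the pace -- GPU horsepower is irrelevant here."""
    interval = 1.0 / max_rate
    for password in candidates:
        if check_login(username, password):
            return password
        time.sleep(interval)  # stay under rate limits
    return None

print(online_guess("alice", ["password", "letmein", "Summer2017"]))
# -> Summer2017
```

At a few guesses per second, the attempt rate, not raw compute, dominates, which is why optimizations here are about request pipelining and candidate ordering rather than hardware.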


5 hours ago, IDNeon said:

Well, to clarify your statement a little:

The only reason GPUs don't help against OWA is other limiting factors, such as how fast you can send attempts at the OWA endpoint.

All of it still depends on speed; it's just a matter of what's bottlenecking you and reducing that. I'm sure there's a laundry list of optimizations for OWA/firewall account cracking where accounts don't have lockout policies.

Maybe I missed something, but isn't OWA the web interface for Office mail logins? Hashcat cracks passwords against encrypted strings/hashes, not against live website logins. That would be Hydra or similar tools for login brute forcing, not password brute forcing against hashes.


Just now, digip said:

Maybe I missed something, but isn't OWA the web interface for Office mail logins? Hashcat cracks passwords against encrypted strings/hashes, not against live website logins. That would be Hydra or similar tools for login brute forcing, not password brute forcing against hashes.

I may have glossed over what you were getting at; it's been a busy day, and I thought you were referring to the time frame in which that task could be performed.


On 3/30/2017 at 4:59 PM, digip said:

The GTX 1080 Ti is one of the newer cards, and also more expensive. This is why matching bang-for-your-buck with what actually fits in your rig matters. If your mobo can't make use of the full PCI-e card's features, it makes no sense to buy it only to have it rate-limited by older hardware specs. This is why you have to research the main features against your budget, and compare GPUs across different mobos, as some boards work better with different CPU setups and PCI-e bus lane capabilities. I've had cards in the past that didn't even fit in machines due to the size of the mobo components and case, so unless you're building from scratch, these are all things to think about before you buy. Especially since opened electronics, unless damaged or DOA, aren't returnable in most cases, only exchangeable for the same component.

Example: https://www.trentonsystems.com/industry-applications/pci-express-interface

edit: I should note, so there's no confusion, that you can use a PCI-e 3.0 card in a 2.0 slot (I do this with my rig, and it's fine for gaming), but you're wasting your money by not getting the full 3.0 + 3.0 combined performance, and you'll be limited to 2.0 mobo speeds. If you buy a 3.0 card and only have a 2.0 slot, you're probably not going to get the card's full potential, so don't waste your money if this is a rig bought for specific tasks.

I'm building my PC from scratch. So are you saying PCI-e 3.0 is probably the better choice? I'm not too sure whether the GTX 1080 Ti supports PCI-e 3.0, though.


1 hour ago, Srazario said:

I'm building my PC from scratch. So are you saying PCI-e 3.0 is probably the better choice? I'm not too sure whether the GTX 1080 Ti supports PCI-e 3.0, though.

It's a PCI Express 3.0 x16 card.

It will work in a PCI-e 2.0 slot, but not at full capability; it will be rate-limited to the 2.0 bus speeds. For gaming that's generally not a huge issue unless it's a high-end game: it will play fine, but it won't be able to max out settings to Ultra, for example, without bottlenecking the system, which can actually cause the machine to lag.

