
h4x0r

Active Members
  • Posts: 35
Everything posted by h4x0r

  1. Yes, that's very true :-) Anyway, I'll be implementing the SAN without the switch, as at the moment I have no plans to share the SAN with any other servers apart from the ESXi guest VMs. Thanks for your reply.
  2. To all: after reading up on using an iSCSI SAN with ESXi, it seems I also need a vSwitch so that the separate SAN subnet can communicate with my clients on the LAN subnet. This is the diagram below; please correct me if I'm wrong. Cheers, Albert
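For reference, the vSwitch and VMkernel port for the SAN subnet can be set up from the ESX service console (or the equivalent `vicfg-*` Remote CLI commands on ESXi 3.5). This is a minimal sketch; the NIC name `vmnic2`, the port group name, and the 10.0.0.0/24 SAN subnet are assumptions, not values from the diagram:

```shell
# Create a dedicated vSwitch for iSCSI traffic
esxcfg-vswitch -a vSwitch1

# Link the SAN-facing physical NIC to it (vmnic2 is an assumed name)
esxcfg-vswitch -L vmnic2 vSwitch1

# Add a port group for the VMkernel iSCSI interface
esxcfg-vswitch -A iSCSI vSwitch1

# Give the VMkernel an address on the SAN subnet (assumed 10.0.0.0/24)
esxcfg-vmknic -a -i 10.0.0.11 -n 255.255.255.0 iSCSI
```

Keeping this vSwitch with no uplink to the LAN subnet is what isolates the storage traffic from client traffic.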
  3. Great, this means I can use 2x direct patch-cable connections to the SAN from each server and just leave one cable for production-line access. Here is the final diagram: http://img245.imageshack.us/img245/2832/iscsisanr.jpg Thanks for all of your comments, guys. Cheers.
  4. OK. My Dell PowerEdge 2950-III comes with 2x Broadcom integrated Gigabit Ethernet ports, and I'm adding an Intel Gigabit Ethernet card for 2 more (4 ports per server in total). I've colour-coded the SAN traffic blue and green, while the red lines are for management-console access. In this case, perhaps I can just remove all of the red lines (no dedicated management console) and use another pair for guest traffic from the network into the servers?
  5. VaKo, here it is: http://kimag.es/share/84637437.jpg Do you mean using a separate fibre-optic fabric? Could you please explain or draw it, as I'm confused here :-) Thanks for replying.
  6. Hi all, following up on the previous thread at http://hak5.org/forums/index.php?showtopic...3&st=0& (confirmed by Decepticon and Cthobs), I'm about to deploy VMware ESXi 3.5 on 2 servers which will share the SAN over iSCSI (2x Gigabit Ethernet teamed). Specs: Dell PowerVault MD3000 with 10x 300 GB 15k rpm SAS drives and 2x dual-port Gigabit Ethernet NICs (4 ports in total); Dell PowerEdge 2950-III with 2x Intel quad-core E5410, 32 GB DDR-II 667 MHz, 4x internal 500 GB 7200 rpm SATA HDDs (RAID 5) - I know that's slow for hosting the VMDKs - and an internal USB slot on the motherboard (but no USB flash disk ???). Here is the diagram: http://img25.imageshack.us/my.php?image=vmlan.jpg Please let me know if this makes sense and follows best practice. Lastly, as I've got 1 TB spare on the internal RAID 10 SATA drives, any idea what I should do with them apart from installing the 32 MB ESXi? Thanks.
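The ~1 TB spare figure lines up with RAID 10 across the 4x 500 GB internal drives. A quick sketch of the usable-capacity arithmetic for the two RAID levels mentioned in the post (formatting overhead and the ESXi install footprint are ignored):

```python
def raid_usable_gb(disks: int, size_gb: float, level: str) -> float:
    """Rough usable capacity for common RAID levels (ignores formatting overhead)."""
    if level == "raid5":
        return (disks - 1) * size_gb   # one disk's worth of capacity goes to parity
    if level == "raid10":
        return disks * size_gb / 2     # every block is mirrored
    raise ValueError(f"unsupported level: {level}")

# 4x 500 GB internal SATA drives from the post
print(raid_usable_gb(4, 500, "raid5"))   # 1500.0
print(raid_usable_gb(4, 500, "raid10"))  # 1000.0
```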
  7. Thanks for the replies, guys; it's all clear now how to use ESXi with iSCSI. Sorry for the late reply due to the Easter holiday ;)
  8. Decepticon, your suggestion and explanation are great. I'm planning to use 2x Gigabit Ethernet per server to access the SAN drives (15k rpm SAS). The plan is to create a separate network subnet spanning the servers, the SAN, and the two Gigabit switches.
  9. Are there any significant benefits in using iSCSI (2x 1 Gb Ethernet) vs. Fibre Channel, apart from price?
  10. Hi all, I'm in the process of getting new virtualization technology working in my office using 2x Dell PowerEdge 2950-III, each with an iSCSI controller accessing the following SAN: 1x Dell PowerVault MD3000 (10x 300 GB 15k rpm SAS). VMware ESXi is installed on an internal USB device to load 4x VMs running Solaris (serving as project, homes, and SAMBA file server, source-code repository, and build server compiling the project source) and 5x Windows Server 2003 VMs acting as application servers running Apache Tomcat. I wonder whether there is any performance benefit in implementing the shared SAN for those two physical servers through iSCSI as opposed to Fibre Channel. I'm aware that FC is faster and more expensive, but in this case I won't run any VM with a DB server on it. I'm looking to get 2x dual-port Gigabit Ethernet so that each server can have 2 Gb of bandwidth. Shall I go down the path of iSCSI or stick with FC, considering the SAN is running 15k rpm SAS drives used by 2 ESXi servers? Please share some thoughts on this configuration. Thanks,