Hyper-V and Storage Spaces Direct

  • Question

  • Looking for info on S2D. We're migrating from a ten-server setup (all physical, no VMs) to a hyperconverged S2D solution. I've heard both good (Dell and DataOn) and bad (CDW). Who has real-world experience setting up and managing S2D? I think this is Microsoft's second attempt at Storage Spaces; have they got it right yet?

    Thanks for any help.


    Greg

    Friday, June 15, 2018 11:52 PM

Answers

  • Well, there are literally thousands of customers running S2D in production, so it is pretty obvious that lots of companies are relying on it.  https://blogs.technet.microsoft.com/filecab/2018/03/27/storage-spaces-direct-momentum/

    That blog post is already three months old.  So if they reached 10,000 clusters in 18 months of availability, it is quite possible they have added over 1,600 more in the last three months.

    To be fair to CDW, yes, there have been issues.  There have also been many installations that did not use certified solutions.  My hunch is that many (if not most) of the issues came from uncertified sites. I have seen many reports where people try to 'roll their own' solution and end up cycling through different hardware components until they find the ones that work.  This is the same process the vendors went through in order to put together a certified solution.  So if you want to avoid repeating all the testing the vendors have already performed to ensure a workable, stable solution, start with a certified solution.


    tim

    • Marked as answer by Greg4230 Monday, June 18, 2018 1:23 PM
    Monday, June 18, 2018 1:04 PM
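
    For context on what "setting up S2D" involves once you have certified hardware, the standard enablement flow is a short PowerShell sequence. This is a minimal sketch only; the node names, cluster name, volume name, and size are placeholders, not values from this thread:

    ```powershell
    # Validate the candidate nodes, including the S2D-specific tests
    Test-Cluster -Node Node01, Node02, Node03, Node04 `
        -Include "Storage Spaces Direct", "Inventory", "Network", "System Configuration"

    # Create the cluster without automatically claiming eligible storage
    New-Cluster -Name S2D-Cluster -Node Node01, Node02, Node03, Node04 -NoStorage

    # Claim the local drives on every node and build the storage pool
    Enable-ClusterStorageSpacesDirect

    # Carve a resilient CSV volume out of the pool
    New-Volume -FriendlyName "Volume1" -FileSystem CSVFS_ReFS `
        -StoragePoolFriendlyName "S2D*" -Size 1TB
    ```

    The `Test-Cluster` validation report is worth reading closely before proceeding; failures there are the most common source of the "fixing issues" experience described above.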

All replies

  • Hi!

    I have set it up on HPE hardware and it's been working without any issues really!

    Compared to the original Storage Spaces, I would say S2D is far better, in my experience at least; I believe Microsoft has improved a lot on the earlier release.

    Best regards,
    Leon


    Blog: https://thesystemcenterblog.com

    Saturday, June 16, 2018 11:35 AM
  • https://www.microsoft.com/en-us/cloud-platform/software-defined-datacenter links to vendor solutions that have been tested and certified.  I can understand your statement about CDW, because CDW is not a listed vendor.  It is wise to start with vendors who have certified their systems for WSSD.

    I wouldn't say that S2D is "Microsoft's 2nd attempt at Storage Spaces".  It is a new implementation that builds upon the initial release.  Since you do not identify what issues you had with Storage Spaces, it is difficult to offer any guidance.


    tim

    Sunday, June 17, 2018 11:28 AM
  • We haven't used either. This will be our first use of S2D or Storage Spaces. Based on what the CDW engineer stated, we'd spend a lot of time fixing issues, etc. I find that hard to believe; if it were true, Dell, DataOn, Lenovo, etc. wouldn't offer it as a solution.

    Looking for other users' experience, and any pitfalls to look for / avoid.


    Greg

    Sunday, June 17, 2018 2:49 PM