Hyper-V 2012 R2 - Firmware only showing "File - Bootmgfw.efi", or "No Boot Entries" on new VMs

    Question

  • This has only just started happening today

    I created a new VM on one of my hosts to test ConfigMgr OSD Deployment, only for it to fail to boot

    When I checked the config of the VM, the Firmware page showed no boot entries at all.

    On existing VMs on the same host, I only see the Bootmgfw.efi option - no CD drive or NICs, even though they were there a few days ago.

    There have been no new updates other than SCEP definitions and no changes to the server

    My other Hyper-V host is fine with no issues, but I'm worried it'll suddenly start there too.

    OS: Windows Server 2012 R2 Ent with the Hyper-V role installed (obviously!), 32 GB RAM
    Server has been running around 3 months

    Anyone got any ideas?


    Regards Craig Wilson

    Wednesday, August 31, 2016 5:45 PM

Answers

  • Hi Craig, did you or someone else run the infamous mofcomp WindowsVirtualization.V2.mof on any of your Hyper-V servers to make them manageable from Windows 10?

    The symptoms you describe align perfectly with what happens if you run that command on a Hyper-V server (the way it breaks Gen 2).

    If you or someone else did run that command, then I'm afraid the damage has already been done and the only way to repair it is to uninstall and reinstall the Hyper-V role.
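    For reference, a minimal sketch of that repair on Server 2012 R2 (standard Server Manager cmdlets, not something specific to this thread) - drain or shut down the VMs first, because each step reboots the host:

      # Run from an elevated PowerShell prompt on the affected host.
      Uninstall-WindowsFeature -Name Hyper-V -Restart

      # After the reboot completes, put the role back:
      Install-WindowsFeature -Name Hyper-V -IncludeManagementTools -Restart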

    • Marked as answer by _Craig Wilson_ Friday, September 2, 2016 8:54 AM
    Wednesday, August 31, 2016 10:04 PM

All replies

  • How was the VM created?

    I have been able to produce configurations like this programmatically, but not through the UI or the PowerShell cmdlets.

    It almost looks malformed, as if someone simply changed the type from Gen 1 to Gen 2 without creating the Generation 2-specific devices.
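    For comparison, a hedged sketch of creating a Generation 2 VM through the supported cmdlet, which sets up the Gen 2 firmware and its default boot entries automatically - the name, path and switch below are illustrative, not taken from this thread:

      # Illustrative values only - substitute your own VM name, VHDX path and virtual switch.
      New-VM -Name "OSD-Test" -Generation 2 -MemoryStartupBytes 2GB `
             -NewVHDPath "D:\VMs\OSD-Test\OSD-Test.vhdx" -NewVHDSizeBytes 60GB `
             -SwitchName "External"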


    Brian Ehlert
    http://ITProctology.blogspot.com
    Learn. Apply. Repeat.

    Wednesday, August 31, 2016 7:05 PM
    Moderator
  • Created via the Hyper-V Management Console

    Speaking of malformed VMs... could it be a result of using Windows 10's updated Hyper-V Management Console to create the VM on the server?

    This wasn't an issue pre-1607, and the other server is still OK as I've not created any new VMs via Windows 10

    (Hope that makes some sense!)


    Regards Craig Wilson

    Wednesday, August 31, 2016 7:18 PM
  • Considering the number of problems that I have experienced with the Anniversary Update first-hand, nothing will surprise me when it is involved.

    Especially when updating. For a milestone release, it has a number of issues.

    If your Hyper-V Server has a shell, try its local Hyper-V Management Console to create the VM. That would narrow the version issue down.


    Brian Ehlert
    http://ITProctology.blogspot.com
    Learn. Apply. Repeat.

    Wednesday, August 31, 2016 7:44 PM
    Moderator
  • Both HV servers have a shell, but I get the same problem when I log in locally

    The other server has 1 VM that is showing the same issue, but new ones I create are (so far) fine


    Regards Craig Wilson

    Wednesday, August 31, 2016 8:22 PM
  • Hi Craig, did you or someone else run the infamous mofcomp WindowsVirtualization.V2.mof on any of your Hyper-V servers to make them manageable from Windows 10?

    The symptoms you describe align perfectly with what happens if you run that command on a Hyper-V server (the way it breaks Gen 2).

    If you or someone else did run that command, then I'm afraid the damage has already been done and the only way to repair it is to uninstall and reinstall the Hyper-V role.

    • Marked as answer by _Craig Wilson_ Friday, September 2, 2016 8:54 AM
    Wednesday, August 31, 2016 10:04 PM
  • Hi Craig,

    >>Could it be a result of using Windows 10's updated Hyper-V Management Console to create the VM on the server?

    >>The other server has 1 VM that is showing the same issue, but new ones I create are (so far) fine

    Is it happening only on the VMs created remotely from Windows 10 with the update installed?

    If yes, keep an eye on the locally created VMs to see if the issue happens again.
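    As a quick check (a hedged sketch, not something from the original posts), you can list the firmware boot order of every Generation 2 VM on a host from an elevated PowerShell session; affected VMs show only the EFI file entry, or nothing at all:

      # List each Gen 2 VM on this host together with its firmware boot order.
      Get-VM | Where-Object { $_.Generation -eq 2 } |
          Get-VMFirmware | Select-Object VMName, BootOrder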

    Best Regards,

    Leo


    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Support, contact tnmff@microsoft.com.

    Thursday, September 1, 2016 2:29 AM
    Moderator
  • >>Hi Craig, did you or someone else run the infamous mofcomp WindowsVirtualization.V2.mof on any of your Hyper-V servers to make them manageable from Windows 10?

    >>The symptoms you describe align perfectly with what happens if you run that command on a Hyper-V server (the way it breaks Gen 2).

    >>If you or someone else did run that command, then I'm afraid the damage has already been done and the only way to repair it is to uninstall and reinstall the Hyper-V role.

    I hadn't come across this, but now you mention it - there was a WMI repair done recently, and I suspect it was done on both servers.

    I'll investigate further and look at a role reinstall as soon as I can get some downtime.


    Regards Craig Wilson


    Thursday, September 1, 2016 6:12 AM
  • Hi Craig,

    >>I'll investigate further and look at a role reinstall as soon as I can get some downtime

    Feel free to share the information if you get any updates on it.

    Best Regards,

    Leo


    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Support, contact tnmff@microsoft.com.

    Friday, September 2, 2016 1:56 AM
    Moderator
  • Thanks for this, James - I reinstalled the role on one of the servers and it is now working correctly again!

    This was a real head-scratcher for a bit, but this worked a treat

    Thanks!


    Regards Craig Wilson

    Friday, September 2, 2016 8:56 AM
  • I just ran into this exact problem. I had an issue with the Windows Server 2016 Hyper-V Management Console managing an instance of Hyper-V 2016 on a remote system. I ran mofcomp WindowsVirtualization.V2.mof and that broke my ability to deploy Gen 2 VMs using VMM 2016. The steps below corrected the issue for me.

    1) Put a VM host in maintenance mode and migrate all VMs to a different node in the cluster (assuming you have a cluster; if not, just shut down all VMs)

    2) On the host in maintenance mode, launch an elevated PowerShell session and run the following commands (see the consolidated sketch after this list):

      a) cd c:\windows\system32

      b) mofcomp .\WindowsVirtualizationUninstall.mof

      c) mofcomp .\WindowsVirtualization.V2.mof

      d) Restart-Service VMMS

      e) cd "C:\Program Files\Microsoft System Center 2016\Virtual Machine Manager\setup"

      f) mofcomp .\scvmmswitchportsettings.mof

      g) Restart-Service SCVMMAgent

      h) Restart VM Host

    Repeat these steps for all broken nodes in the cluster.
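    A minimal sketch of steps 2a-2h as one script, assuming the default system and VMM 2016 agent install paths (adjust them if yours differ) and an elevated session on the node that is in maintenance mode:

      # Recompile the Hyper-V v2 WMI classes from the local MOF files.
      Set-Location C:\Windows\System32
      mofcomp .\WindowsVirtualizationUninstall.mof
      mofcomp .\WindowsVirtualization.V2.mof
      Restart-Service VMMS                 # Hyper-V Virtual Machine Management service

      # Re-register the VMM switch port settings and restart the VMM agent.
      Set-Location "C:\Program Files\Microsoft System Center 2016\Virtual Machine Manager\setup"
      mofcomp .\scvmmswitchportsettings.mof
      Restart-Service SCVMMAgent

      # Finally, reboot the host.
      Restart-Computer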

    Tuesday, May 29, 2018 2:02 PM