
Azure Migrate appliance Discovery Service Crashing

Christopher Lewis 41 Reputation points
2026-04-15T15:15:46.4033333+00:00

I'm running an Azure Migrate appliance for physical servers, and the Discovery Service keeps crashing with a .NET error.

Appliance services

Service name      Installed version   Latest version   Status
Discovery agent   2.0.3375.721        2.0.3378.727     Stopped

Event Viewer:

Log Name:      Application
Source:        Application Error
Date:          4/15/2026 7:49:04 AM
Event ID:      1000
Task Category: (100)
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      WIN-3HKK35QSF25
Description:
Faulting application name: ServerDiscoveryService.exe, version: 2.0.3375.721, time stamp: 0x696f0000
Faulting module name: unknown, version: 0.0.0.0, time stamp: 0x00000000
Exception code: 0xc0000005
Fault offset: 0x00007ffde098e6b6
Faulting process id: 0x1248
Faulting application start time: 0x01dccce6d79c9ead
Faulting application path: C:\Program Files\Microsoft Azure Server Discovery Service\ServerDiscoveryService.exe
Faulting module path: unknown
Report Id: 94b33896-0104-420a-9b7e-8d37c890677e
Faulting package full name: 
Faulting package-relative application ID: 
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Application Error" />
    <EventID Qualifiers="0">1000</EventID>
    <Version>0</Version>
    <Level>2</Level>
    <Task>100</Task>
    <Opcode>0</Opcode>
    <Keywords>0x80000000000000</Keywords>
    <TimeCreated SystemTime="2026-04-15T14:49:04.6135820Z" />
    <EventRecordID>8655</EventRecordID>
    <Correlation />
    <Execution ProcessID="0" ThreadID="0" />
    <Channel>Application</Channel>
    <Computer>WIN-3HKK35QSF25</Computer>
    <Security />
  </System>
  <EventData>
    <Data>ServerDiscoveryService.exe</Data>
    <Data>2.0.3375.721</Data>
    <Data>696f0000</Data>
    <Data>unknown</Data>
    <Data>0.0.0.0</Data>
    <Data>00000000</Data>
    <Data>c0000005</Data>
    <Data>00007ffde098e6b6</Data>
    <Data>1248</Data>
    <Data>01dccce6d79c9ead</Data>
    <Data>C:\Program Files\Microsoft Azure Server Discovery Service\ServerDiscoveryService.exe</Data>
    <Data>unknown</Data>
    <Data>94b33896-0104-420a-9b7e-8d37c890677e</Data>
    <Data>
    </Data>
    <Data>
    </Data>
  </EventData>
</Event>

C:\ProgramData\Microsoft Azure\Logs\Server\Discovery\ServiceAria_20260415.log has lots of errors like this:

14:48 Critical:   RefreshSignatureCacheEvent::[SeverityLevel, Critical]:[Level, 16]:[Timestamp, 2026-04-15 14:48:42Z]:[EntityType, Microsoft.Azure.FDS.ServiceContract.Server.ServerSignature]:[Signature, {
  "Id": "192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895",
  "Fqdn": "192.168.1.220",
  "BIOSGuid": "CCD39883-6368-4E29-B273-802E7D6A13A7",
  "OsType": "WindowsGuest",
  "RunAsAccountId": "af5edc3c-8dfc-595a-807e-a02b527758a3",
  "FabricElementType": "Server",
  "ID": "192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895",
  "MetaData": {
    "AppsAndRolesDiscoveryPipe": "Unknown",
    "DependencyMapDiscoveryPipe": "Unknown",
    "SQLMetadataDiscoveryPipe": "Unknown",
    "PendingUpdatesDiscoveryPipe": "Unknown",
    "IsDependencyMappingEnabled": false,
    "IsDependencyMappingAutoEnabled": true,
    "Fqdn": null,
    "IpAddresses": null,
    "RunAsAccountId": "af5edc3c-8dfc-595a-807e-a02b527758a3",
    "AppsAndRolesHydratedRunAsAccountId": null,
    "DependencyMapHydratedRunAsAccountId": null,
    "SQLHydratedRunAsAccountId": "",
    "PendingUpdatesHydratedRunAsAccountId": null,
    "GuestOSType": 1,
    "MachineName": "192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895",
    "MachineId": "192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895",
    "FabricElementType": "MetaData",
    "IsSIAttempted": false,
    "SIValidationErrors": [
      {
        "ErrorCode": "UnableToConnectToPhysicalServer",
        "ErrorLevel": "Error",
        "ErrorMessageParametersMap": {
          "name": "192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895",
          "errorCode": "UnableToConnectToPhysicalServer",
          "errorMessage": "Unable to connect to this machine. The error per IP is: \n[IP: 192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895, Error:Unable to connect to server '192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895' due to an error. Error code: '-2147024894' Error details: 'Could not load file or assembly 'Microsoft.Management.Infrastructure, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'. The system cannot find the file specified.'.]\n[IP: 192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895, Error:Unable to connect to server '192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895' due to an error. Error code: '-2147024894' Error details: 'Could not load file or assembly 'Microsoft.Management.Infrastructure, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'. The system cannot find the file specified.'.]\n"
        },
        "RunAsAccountId": "af5edc3c-8dfc-595a-807e-a02b527758a3",
        "DiscoveryScope": "AppsAndRoles",
        "Id": "",
        "Message": "",
        "SummaryMessage": "",
        "PossibleCause": "",
        "RecommendedAction": "",
        "UpdatedTimeStamp": "2026-04-15T14:48:42.4476578Z",
        "AdditionalInfo": {}
      }
    ],
    "HasSIValidationError": true,
    "HasDependencyMapValidationError": true,
    "DependencyMapValidationErrors": [
      {
        "ErrorCode": "UnableToConnectToPhysicalServer",
        "ErrorLevel": "Error",
        "ErrorMessageParametersMap": {
          "name": "192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895",
          "errorCode": "UnableToConnectToPhysicalServer",
          "errorMessage": "Unable to connect to this machine. The error per IP is: \n[IP: 192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895, Error:Unable to connect to server '192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895' due to an error. Error code: '-2147024894' Error details: 'Could not load file or assembly 'Microsoft.Management.Infrastructure, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'. The system cannot find the file specified.'.]\n[IP: 192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895, Error:Unable to connect to server '192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895' due to an error. Error code: '-2147024894' Error details: 'Could not load file or assembly 'Microsoft.Management.Infrastructure, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'. The system cannot find the file specified.'.]\n"
        },
        "RunAsAccountId": "af5edc3c-8dfc-595a-807e-a02b527758a3",
        "DiscoveryScope": "DependencyMap",
        "Id": "",
        "Message": "",
        "SummaryMessage": "",
        "PossibleCause": "",
        "RecommendedAction": "",
        "UpdatedTimeStamp": "2026-04-15T14:48:42.4476578Z",
        "AdditionalInfo": {}
      }
    ]
  }
}]

and


14:49 Error:      Properties::        [CommandSource, Internal]:[operation, RefreshShallowDiscovery]:[VirtualMachineUuid, 192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895]:[PartnerType, MySQL]:[AgentSessionId, 6c07e3a6-9b8c-4b50-b11c-4474c01af406]:[version, 2.0.3375.721]:[fabrictype, Server]:[agenttype, ServerDiscovery]:[agentid, 48168512-860b-4ca1-8092-a2d5486bf39c-agent]:[ServiceEndPoint, discoverysrv.cus.prod.migration.windowsazure.com]:[ip, 192.168.1.226]:[threadname, WorkerThread]:[activityid, b5437918-b8c9-4ae7-8cfd-7ded2ebbeece]:[LogGenerationtime, 2026-04-15 14:49:04Z]
14:49 Error:      Messages::          [SeverityLevel, Error]:[Level, 8]:[Timestamp, 2026-04-15 14:49:04Z]:[Raw, Exception while creating secure PS Session on Port:5986 with machine 192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895.]

and

14:49 Error:      Properties::        [CommandSource, Internal]:[operation, RefreshShallowDiscovery]:[VirtualMachineUuid, 192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895]:[PartnerType, MySQL]:[AgentSessionId, 6c07e3a6-9b8c-4b50-b11c-4474c01af406]:[version, 2.0.3375.721]:[fabrictype, Server]:[agenttype, ServerDiscovery]:[agentid, 48168512-860b-4ca1-8092-a2d5486bf39c-agent]:[ServiceEndPoint, discoverysrv.cus.prod.migration.windowsazure.com]:[ip, 192.168.1.226]:[threadname, WorkerThread]:[activityid, b5437918-b8c9-4ae7-8cfd-7ded2ebbeece]:[LogGenerationtime, 2026-04-15 14:49:04Z]
14:49 Error:      Exceptions::        [SeverityLevel, Error]:[Level, 8]:[Timestamp, 2026-04-15 14:49:04Z]:[Message, CustomMessage: Exception while creating secure PS Session on Port:5986 with machine 192-168-1-220-b2c5822f-d312-46a8-8934-5f947194b895. || ExceptionType: System.IO.FileNotFoundException || ExceptionMessage: Could not load file or assembly 'Microsoft.Management.Infrastructure, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'. The system cannot find the file specified.]:[ExceptionType, System.IO.FileNotFoundException]:[Source, System.Management.Automation]:[StackTrace,    at System.Management.Automation.CimClassDeserializationCache`1..ctor()
   at System.Management.Automation.DeserializationContext..ctor(DeserializationOptions options, PSRemotingCryptoHelper cryptoHelper)
   at System.Management.Automation.Remoting.Fragmentor..ctor(Int32 fragmentSize, PSRemotingCryptoHelper cryptoHelper)
   at System.Management.Automation.Remoting.BaseTransportManager..ctor(PSRemotingCryptoHelper cryptoHelper)
   at System.Management.Automation.Remoting.Client.BaseClientTransportManager..ctor(Guid runspaceId, PSRemotingCryptoHelper cryptoHelper)
   at System.Management.Automation.Remoting.Client.BaseClientSessionTransportManager..ctor(Guid runspaceId, PSRemotingCryptoHelper cryptoHelper)
   at System.Management.Automation.Remoting.Client.WSManClientSessionTransportManager..ctor(Guid runspacePoolInstanceId, WSManConnectionInfo connectionInfo, PSRemotingCryptoHelper cryptoHelper, String sessionName)
   at System.Management.Automation.Runspaces.WSManConnectionInfo.CreateClientSessionTransportManager(Guid instanceId, String sessionName, PSRemotingCryptoHelper cryptoHelper)
   at System.Management.Automation.Remoting.ClientRemoteSessionDSHandlerImpl..ctor(ClientRemoteSession session, PSRemotingCryptoHelper cryptoHelper, RunspaceConnectionInfo connectionInfo, URIDirectionReported uriRedirectionHandler)
   at System.Management.Automation.Remoting.ClientRemoteSessionImpl..ctor(RemoteRunspacePoolInternal rsPool, URIDirectionReported uriRedirectionHandler)
   at System.Management.Automation.Internal.ClientRunspacePoolDataStructureHandler.CreateClientRemoteSession(RemoteRunspacePoolInternal rsPoolInternal)
   at System.Management.Automation.Internal.ClientRunspacePoolDataStructureHandler..ctor(RemoteRunspacePoolInternal clientRunspacePool, TypeTable typeTable)
   at System.Management.Automation.Runspaces.Internal.RemoteRunspacePoolInternal.CreateDSHandler(TypeTable typeTable)
   at System.Management.Automation.Runspaces.Internal.RemoteRunspacePoolInternal..ctor(Int32 minRunspaces, Int32 maxRunspaces, TypeTable typeTable, PSHost host, PSPrimitiveDictionary applicationArguments, RunspaceConnectionInfo connectionInfo, String name)
   at System.Management.Automation.Runspaces.RunspacePool..ctor(Int32 minRunspaces, Int32 maxRunspaces, TypeTable typeTable, PSHost host, PSPrimitiveDictionary applicationArguments, RunspaceConnectionInfo connectionInfo, String name)
   at System.Management.Automation.RemoteRunspace..ctor(TypeTable typeTable, RunspaceConnectionInfo connectionInfo, PSHost host, PSPrimitiveDictionary applicationArguments, String name, Int32 id)
   at System.Management.Automation.Runspaces.RunspaceFactory.CreateRunspace(RunspaceConnectionInfo connectionInfo, PSHost host, TypeTable typeTable, PSPrimitiveDictionary applicationArguments, String name)
   at System.Management.Automation.Runspaces.RunspaceFactory.CreateRunspace(RunspaceConnectionInfo connectionInfo, PSHost host, TypeTable typeTable)
   at System.Management.Automation.Runspaces.RunspaceFactory.CreateRunspace(PSHost host, RunspaceConnectionInfo connectionInfo)
   at System.Management.Automation.Runspaces.RunspaceFactory.CreateRunspace(RunspaceConnectionInfo connectionInfo)
   at Microsoft.AzureMigrate.Appliance.PowerShellClients.PowerShellClient.InitializeRunspaceAsync(Boolean useSsl, Int32 port, CancellationToken ct)
   at Microsoft.AzureMigrate.Appliance.PowerShellClients.PowerShellClient.CreatePsSessionAsync(CancellationToken ct)]:[HelpLink, ]:[HResult, -2147024894]

Testing PowerShell remoting to the two machines I'm targeting works perfectly fine.
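For reference, I verified remoting with commands along these lines (the IP is the target from the logs above):

```powershell
# Verify the WinRM HTTPS listener (port 5986) responds on the target
Test-WSMan -ComputerName 192.168.1.220 -UseSSL

# Open an interactive remote session with the discovery credentials
Enter-PSSession -ComputerName 192.168.1.220 -UseSSL -Credential (Get-Credential)
```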


Answer accepted by question author

  1. Siva shunmugam Nadessin 9,625 Reputation points Microsoft External Staff Moderator
    2026-04-15T16:15:31.4666667+00:00

    Hello Christopher Lewis ,

    Thank you for reaching out to the Microsoft Q&A forum. 

    From the details you shared, your physical-server discovery service is crashing because it can't load the Microsoft.Management.Infrastructure assembly (the CIM/WinRM DLL that PowerShell uses to talk to Windows hosts). That assembly ships as part of Windows Management Framework (WMF), so if it's missing or mismatched on your appliance, the Discovery agent will fault with exactly the errors you're seeing.
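    The load failure is easy to reproduce outside the service. A minimal check on the appliance (the assembly's full name below is copied verbatim from your log) is:

    ```powershell
    # Attempt to load the exact assembly the Discovery Service is failing on.
    # The full name is copied from the FileNotFoundException in ServiceAria_*.log.
    try {
        [System.Reflection.Assembly]::Load(
            'Microsoft.Management.Infrastructure, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'
        ) | Select-Object FullName, Location
    } catch [System.IO.FileNotFoundException] {
        Write-Warning "Reproduced the service's failure: $($_.Exception.Message)"
    }
    ```

    If this throws the same FileNotFoundException, the appliance itself is missing the assembly, and the fix belongs on the appliance rather than on the target servers.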

    Here’s what you can try:

    1. Verify the appliance OS & WMF version
       • Make sure your Azure Migrate appliance is running Windows Server 2016 (or later) and has WMF 5.1 (or newer) installed.
       • From an elevated PowerShell prompt on the appliance, run:

         Get-Module -ListAvailable CimCmdlets

       • You should see the CimCmdlets module (the CIM cmdlets that sit on top of the Microsoft.Management.Infrastructure assembly). If it's missing, install or repair WMF 5.1 (https://docs.microsoft.com/powershell/wmf/5.1/install-configure).
    2. Check for the missing DLL in the GAC
       • On the appliance, look for Microsoft.Management.Infrastructure.dll under C:\Windows\Microsoft.NET\assembly\GAC_MSIL.
       • If it's not there, repairing the .NET Framework or installing WMF will put it back in place.
    3. Update the Discovery agent service
       • Your logs show you're on 2.0.3375.721 but the latest is 2.0.3378.727.
       • In the appliance configuration manager, go to Auto-update → Manually update an older version and point it at the new package, or re-run the installer script for physical appliance discovery (see "manually update an older version of appliance services" in the docs).
    4. Confirm WinRM/HTTPS prerequisites on target servers
       • You need WinRM over port 5986 with a valid server-authentication certificate (CN matches the hostname, not expired or self-signed) on each physical machine you're discovering.
       • From the appliance, test the HTTPS listener on the target:

         Test-WSMan -ComputerName 192.168.1.220 -UseSSL

       • If that fails, fix the certificate or the WinRM listener on the target.
    5. Restart services & re-run discovery
       • After the above, restart the Microsoft Azure Server Discovery Service (ServerDiscoveryService.exe) from Services.msc.
       • Watch Event Viewer and C:\ProgramData\Microsoft Azure\Logs\Server\Discovery\ServiceAria_*.log for errors.
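    As a sketch, the WMF, GAC, and WinRM checks above can be combined into one quick pass on the appliance (the GAC path and target IP are the examples from this thread; adjust for your environment):

    ```powershell
    # 1. PowerShell / WMF build on the appliance
    $PSVersionTable.PSVersion

    # 2. Is Microsoft.Management.Infrastructure.dll present in the .NET 4 GAC?
    Get-ChildItem 'C:\Windows\Microsoft.NET\assembly\GAC_MSIL' -Recurse `
        -Filter 'Microsoft.Management.Infrastructure.dll' -ErrorAction SilentlyContinue |
        Select-Object FullName, @{ Name = 'FileVersion'; Expression = { $_.VersionInfo.FileVersion } }

    # 3. Appliance service status (appliance agent services start with "am")
    Get-Service -Name am* -ErrorAction SilentlyContinue | Format-Table Name, Status, DisplayName

    # 4. Can the appliance reach the target over WinRM/HTTPS (5986)?
    Test-WSMan -ComputerName 192.168.1.220 -UseSSL
    ```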

    If it still crashes, please share:

    • Appliance OS version + WMF/PowerShell build
    • Whether auto-update is enabled or disabled
    • The output of Get-Service -Name am* so we can see all the appliance services and their statuses
    • Confirmation that Microsoft.Management.Infrastructure.dll exists on the appliance (and its version)

    References

    • Configure WinRM for HTTPS (prerequisite for physical discovery): https://docs.microsoft.com/azure/migrate/how-to-set-up-appliance-physical#verify-prerequisites

    • Manually update an older version of appliance services: https://docs.microsoft.com/azure/migrate/migrate-appliance#manually-update-an-older-version

    • Appliance services overview: https://docs.microsoft.com/azure/migrate/migrate-appliance#appliance-services


1 additional answer

  1. Q&A Assist
    2026-04-15T15:16:14.1366667+00:00
    AI-generated content: this answer was created with AI from Q&A Assist.

    The appliance logs show repeated UnableToConnectToPhysicalServer and SSHOperationTimeout-type conditions, which align with Azure Migrate’s documented connectivity and timeout errors. These connectivity failures won’t by themselves explain the .NET crash, but they must be fixed first; otherwise the discovery service will keep failing operations and may remain unstable.

    Based on the documented behavior, use this sequence:

    1. Fix UnableToConnectToPhysicalServer (60001). The error text in your log matches the documented 60001: UnableToConnectToPhysicalServer condition for physical servers. For each affected physical server (for example 192-168-1-220-…):
      • Ensure the server meets all Azure Migrate physical server prerequisites and port access requirements.
      • On the Azure Migrate appliance, add the IP addresses of the discovered servers to the WinRM TrustedHosts list and retry. This is required to allow remote inbound connections:
        • Windows source servers: WinRM port 5985 (HTTP)
        • Linux source servers: SSH port 22 (TCP)
      • Confirm the correct authentication method is selected on the appliance for these servers.
      • If, after these checks, the error persists, open a Microsoft support case and include the appliance machine ID from the configuration manager footer.
    2. Fix credential-related issues (60002, 10004, 10005, 10012). The log shows failed attempts to create secure PowerShell sessions, which commonly map to the credential errors documented for discovery and dependency analysis:
      • 60002: InvalidServerCredentials – verify the credentials configured on the appliance can log on to the source servers directly. If they were changed or expired, edit them on the appliance and revalidate.
      • 10004 / 10012: Credentials not provided / incorrect domain – ensure credentials exist for the OS type of each server and that domain credentials are only used for domain-joined machines with a correct domain name.
      • 10005: GuestCredentialNotValid – if this appears in the portal, follow the remediation steps in the referenced article to correct or replace the credentials.
    3. Validate WMI/PowerShell access (Windows sources). The stack trace shows failures in System.Management.Automation when creating remote runspaces, which depend on WMI and PowerShell remoting:
      • On each Windows source server, ensure the account used by the appliance has WMI permissions on the root namespace and subnamespaces. If you see GetWMIObjectAccessDenied (9022), follow the documented steps:
        • Run wmimgmt.msc on the source server.
        • Right-click WMI Control (Local) → Properties → Security.
        • Edit permissions on ROOT, add the appliance credential account, grant execute permissions, and apply to “This namespace and subnamespaces”.
      • After adjusting WMI permissions, verify the fix using the mitigation verification steps referenced in the documentation.
    4. Check for timeouts and OS patch level (60005). If any Linux or SSH-based discovery is involved, the pattern also matches 60005: SSHOperationTimeout:
      • Ensure impacted servers have the latest kernel and OS updates installed.
      • Place the appliance and source servers on the same domain or low-latency network segment.
      • From the appliance, connect to the impacted server and run the commands documented in the appliance troubleshooting guide to confirm they return valid (non-null) data.
      • If timeouts persist, open a Microsoft support case with the appliance machine ID.
    5. Restart the discovery service from the portal. Once connectivity, credentials, and WMI/PowerShell access are corrected:
      • In the Azure portal, go to Azure Migrate project → Discovery and assessment → Overview → Manage → Appliances.
      • Select the affected appliance and choose Refresh services to restart the discovery service.
      • Monitor the appliance services page to confirm the Discovery agent moves to a healthy, running state.
    6. When to escalate the .NET crash. If, after the above remediation, the ServerDiscoveryService.exe process still crashes with 0xc0000005 and the same Microsoft.Management.Infrastructure FileNotFoundException in the appliance logs, that indicates an internal error in the appliance binaries rather than a pure connectivity issue. At that point:
      • Collect the appliance logs (including ServiceAria_*.log and Windows Application logs around Event ID 1000).
      • Open a Microsoft support case, referencing the recurring 60001/SSHOperationTimeout and the Microsoft.Management.Infrastructure load failure, and provide the appliance machine ID.
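    As a sketch, the TrustedHosts, remoting, and WMI checks above can be run from an elevated PowerShell prompt on the appliance (the IP and the credential prompt are examples from this thread; substitute the account configured on the appliance):

    ```powershell
    # Append the source server to WinRM TrustedHosts on the appliance (don't overwrite the list).
    Set-Item WSMan:\localhost\Client\TrustedHosts -Value '192.168.1.220' -Concatenate -Force

    # Confirm the WinRM HTTPS listener (5986) answers.
    Test-WSMan -ComputerName 192.168.1.220 -UseSSL

    # Run a WMI query remotely with the same credential the appliance uses,
    # which exercises both PowerShell remoting and root\cimv2 permissions.
    Invoke-Command -ComputerName 192.168.1.220 -UseSSL -Credential (Get-Credential) -ScriptBlock {
        Get-WmiObject -Namespace root\cimv2 -Class Win32_OperatingSystem | Select-Object Caption, Version
    }
    ```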

