Giuseppe Iannacone Posted January 10, 2019

We are experiencing issues with asset management. Using the Hornbill tool for SCCM import we are able to retrieve data from the DB, but when the records are written to Hornbill they are all empty. Here is the JSON configuration file (I've used XXXXX for the sensitive data):

```json
{
  "APIKey": "XXXXX",
  "InstanceId": "XXXXX",
  "LogSizeBytes": 1000000,
  "SQLConf": {
    "Driver": "odbc",
    "Server": "",
    "Database": "Asset_Hornbill",
    "Authentication": "",
    "UserName": "XXXXX",
    "Password": "XXXXXXXX",
    "Port": 0,
    "Encrypt": false,
    "Query": "SELECT OARSys.ResourceID AS [AssetID], OARSys.User_Name0 AS [UserName], OARSys.Netbios_Name0 AS [MachineName], OARSys.Resource_Domain_OR_Workgr0 AS [NETDomain], dbo.v_GS_OPERATING_SYSTEM.Caption0 AS [OperatingSystemCaption], OARSys.Operating_System_Name_and0 AS [OperatingSystem], dbo.v_GS_OPERATING_SYSTEM.Version0 AS [OperatingSystemVersion], dbo.v_GS_OPERATING_SYSTEM.CSDVersion0 AS [ServicePackVersion], dbo.v_GS_COMPUTER_SYSTEM.Manufacturer0 AS [SystemManufacturer], dbo.v_GS_COMPUTER_SYSTEM.Model0 AS [SystemModel], dbo.v_GS_PC_BIOS.SerialNumber0 AS [SystemSerialNumber], OAProc.MaxClockSpeed0 AS [ProcessorSpeedGHz], OAProc.Name0 AS [ProcessorName], dbo.v_GS_COMPUTER_SYSTEM.NumberOfProcessors0 AS [NumberofProcessors], dbo.v_GS_X86_PC_MEMORY.TotalPhysicalMemory0 AS [MemoryKB], dbo.v_GS_LOGICAL_DISK.Size0 AS [DiskSpaceMB], dbo.v_GS_LOGICAL_DISK.FreeSpace0 AS [FreeDiskSpaceMB], OAIP.IP_Addresses0 AS [IPAddress], OAMac.MAC_Addresses0 AS [MACAddress], dbo.v_GS_PC_BIOS.Description0 AS [BIOSDescription], dbo.v_GS_PC_BIOS.ReleaseDate0 AS [BIOSReleaseDate], dbo.v_GS_PC_BIOS.SMBIOSBIOSVersion0 AS [SMBIOSVersion], dbo.v_GS_SYSTEM.SystemRole0 AS [SystemType], OASysEncl.ChassisTypes0 AS [ChassisTypes], OASysEncl.TimeStamp AS [ChassisDate], OARSys.AD_Site_Name0 AS [SiteName] FROM dbo.v_R_System OUTER APPLY (SELECT TOP 1 * FROM dbo.v_R_System b WHERE b.Netbios_Name0 = dbo.v_R_System.Netbios_Name0 ORDER BY SMS_UUID_Change_Date0 DESC) OARSys OUTER APPLY (SELECT TOP 1 dbo.v_GS_SYSTEM_ENCLOSURE.* FROM dbo.v_GS_SYSTEM_ENCLOSURE WHERE dbo.v_GS_SYSTEM_ENCLOSURE.ResourceID = dbo.v_R_System.ResourceID ORDER BY TimeStamp DESC) OASysEncl OUTER APPLY (SELECT TOP 1 IP_Addresses0, ROW_NUMBER() OVER (order by (SELECT 0)) AS rowNum FROM dbo.v_RA_System_IPAddresses WHERE dbo.v_RA_System_IPAddresses.ResourceID = dbo.v_R_System.ResourceID ORDER BY rowNum DESC) OAIP OUTER APPLY (SELECT TOP 1 MAC_Addresses0 FROM dbo.v_RA_System_MACAddresses WHERE dbo.v_RA_System_MACAddresses.ResourceID = dbo.v_R_System.ResourceID ) OAMac OUTER APPLY (SELECT TOP 1 MaxClockSpeed0, Name0 FROM dbo.v_GS_PROCESSOR WHERE dbo.v_GS_PROCESSOR.ResourceID = dbo.v_R_System.ResourceID ORDER BY TimeStamp DESC) OAProc LEFT JOIN dbo.v_GS_X86_PC_MEMORY ON dbo.v_GS_X86_PC_MEMORY.ResourceID = dbo.v_R_System.ResourceID LEFT JOIN dbo.v_GS_OPERATING_SYSTEM ON dbo.v_GS_OPERATING_SYSTEM.ResourceID = dbo.v_R_System.ResourceID LEFT JOIN dbo.v_GS_COMPUTER_SYSTEM ON dbo.v_GS_COMPUTER_SYSTEM.ResourceID = dbo.v_R_System.ResourceID LEFT JOIN dbo.v_GS_PC_BIOS ON dbo.v_GS_PC_BIOS.ResourceID = dbo.v_R_System.ResourceID LEFT JOIN dbo.v_GS_LOGICAL_DISK ON dbo.v_GS_LOGICAL_DISK.ResourceID = dbo.v_R_System.ResourceID LEFT JOIN dbo.v_FullCollectionMembership ON (dbo.v_FullCollectionMembership.ResourceID = v_R_System.ResourceID) LEFT JOIN dbo.v_GS_SYSTEM ON dbo.v_GS_SYSTEM.ResourceID = dbo.v_R_System.ResourceID WHERE dbo.v_GS_LOGICAL_DISK.DeviceID0 = 'C:' AND dbo.v_FullCollectionMembership.CollectionID = 'SMS00001' "
  },
  "AssetTypes": [
    {
      "AssetType": "Server",
      "Query": "AND OASysEncl.ChassisTypes0 IN (2, 17, 18, 19, 20, 21, 22, 23)",
      "AssetIdentifier": {
        "DBColumn": "SystemSerialNumber",
        "Entity": "AssetsComputer",
        "EntityColumn": "h_serial_number"
      }
    }
  ],
  "AssetGenericFieldMapping": {
    "h_name": "[MachineName]", "h_site": "[SiteName]", "h_asset_tag": "[MachineName]",
    "h_acq_method": "", "h_actual_retired_date": "", "h_beneficiary": "", "h_building": "",
    "h_cost": "", "h_cost_center": "", "h_country": "", "h_created_date": "",
    "h_deprec_method": "", "h_deprec_start": "", "h_description": "[MachineName] ([SystemModel])",
    "h_disposal_price": "", "h_disposal_reason": "", "h_floor": "", "h_geo_location": "",
    "h_invoice_number": "", "h_location": "", "h_location_type": "", "h_maintenance_cost": "",
    "h_maintenance_ref": "", "h_notes": "", "h_operational_state": "", "h_order_date": "",
    "h_order_number": "", "h_owned_by": "[UserName]", "h_product_id": "", "h_received_date": "",
    "h_residual_value": "", "h_room": "", "h_scheduled_retire_date": "", "h_supplier_id": "",
    "h_supported_by": "", "h_used_by": "[UserName]", "h_version": "", "h_warranty_expires": "",
    "h_warranty_start": ""
  },
  "AssetTypeFieldMapping": {
    "h_name": "[MachineName]", "h_mac_address": "[MACAddress]", "h_net_ip_address": "[IPAddress]",
    "h_net_computer_name": "[MachineName]", "h_net_win_domain": "[NETDomain]",
    "h_model": "[SystemModel]", "h_manufacturer": "[SystemManufacturer]",
    "h_cpu_info": "[ProcessorName]", "h_description": "[SystemModel]", "h_last_logged_on": "",
    "h_last_logged_on_user": "", "h_memory_info": "[MemoryKB]", "h_net_win_dom_role": "",
    "h_optical_drive": "", "h_os_description": "[OperatingSystem]", "h_os_registered_to": "",
    "h_os_serial_number": "", "h_os_service_pack": "[ServicePackVersion]", "h_os_type": "",
    "h_os_version": "[OperatingSystemVersion]", "h_physical_disk_size": "[DiskSpaceMB]",
    "h_serial_number": "[SystemSerialNumber]", "h_cpu_clock_speed": "[ProcessorSpeedGHz]",
    "h_physical_cpus": "[NumberofProcessors]", "h_logical_cpus": "", "h_bios_name": "[BIOSDescription]",
    "h_bios_manufacturer": "", "h_bios_serial_number": "", "h_bios_release_date": "[BIOSReleaseDate]",
    "h_bios_version": "[SMBIOSVersion]", "h_max_memory_capacity": "", "h_number_memory_slots": "",
    "h_net_name": "", "h_subnet_mask": ""
  }
}
```
James Ainsworth Posted January 10, 2019

Hi @Giuseppe Iannacone

I may have to let someone with a bit more knowledge have a look at that, but in the meantime, could you check which version you are on? There was a release at the end of last week. Being on the latest version will also help with troubleshooting, by ruling out issues already fixed in older versions. Regards, James
Giuseppe Iannacone Posted January 10, 2019 (Author)

@James Ainsworth Thank you, James. It was the first thing I checked after posting, and I confirm it is working now. If I understand correctly, the tool validates the data based on the h_name field, but this may create duplicate entries if a PC is renamed in the domain, or am I wrong? Is there a way to validate the data based on the PC serial number/service tag? Thank you for the reply.
James Ainsworth Posted January 11, 2019

Hi @Giuseppe Iannacone

Great to hear that this is working for you now. I will see if I can find out more about the validation; I'm afraid I don't know off the top of my head. Regards, James
James Ainsworth Posted January 11, 2019

Hi @Giuseppe Iannacone

A bit of a guess here... I'm assuming that this is where you set up the validation. You can execute a "Dry Run" as explained in the documentation, which should let you test some of this. Hope that helps. James
Giuseppe Iannacone Posted January 11, 2019 (Author)

@James Ainsworth It is working with your suggestions (I had to include "Entity": "AssetsComputer" for it to work fully), but my concern is that if I have a duplicate entry in Asset Management, the tool will update one of the assets arbitrarily (I suppose the one with the lower ID in the Hornbill DB). I know that the upload function can have a validation rule, but an agent may still create a duplicate entry manually. Is there a setting that denies a duplicate entry based on a criterion (for example the serial number)?
Steve G Posted January 14, 2019

Hi @Giuseppe Iannacone,

The example that @James Ainsworth provided above will prevent duplicates based on a matching serial number: if an identical serial number is found, the asset details will be updated when any of the fields do not match their existing values. Are you looking for functionality in the tool to not allow updates when a match is found, or for functionality in the UI to prevent analysts from creating assets with identical serial numbers? Thanks, Steve
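To make the matching behaviour Steve describes concrete, here is a minimal Python sketch of the serial-number upsert decision. This is a hypothetical illustration, not the tool's actual source: `upsert_action` and `existing_by_key` are invented names, and `existing_by_key` stands in for the Hornbill API lookup against the `EntityColumn`.

```python
def upsert_action(db_row, existing_by_key, identifier):
    """Sketch of the create-or-update decision driven by the
    AssetIdentifier config block (hypothetical, not the real tool).

    db_row          -- one row from the SQL query, keyed by column alias
    existing_by_key -- mapping of identifier value -> existing asset record,
                       standing in for the Hornbill entity search
    identifier      -- the "AssetIdentifier" object from the config
    """
    key = db_row[identifier["DBColumn"]]  # e.g. the SystemSerialNumber value
    match = existing_by_key.get(key)
    if match is None:
        return "create"                   # no asset with this serial yet
    # update only if a mapped field actually differs from the stored value
    changed = any(match.get(col) != val for col, val in db_row.items())
    return "update" if changed else "skip"
```

Under this sketch, a renamed PC with an unchanged serial number would produce an "update" rather than a second asset, which is the behaviour the serial-number identifier is meant to guarantee.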
Giuseppe Iannacone Posted January 14, 2019 (Author)

@Steve G Sorry, my request was not clear. I understand that the tool works correctly with this criterion: if the serial number matches, the asset is updated. That is fine. What I would like is a validation criterion in Asset Management generally (like the CSV upload tool has), to prevent an agent from creating the same asset twice or more. In fact I was able to manually create a new asset with the same serial number, and the tool then updated the asset with the smallest ID (or at least that is my impression). So my question concerns the admin side: is there a setting that prevents a duplicate entry in the asset list?
Steve G Posted January 14, 2019

Hi @Giuseppe Iannacone,

Ah OK, I see what you mean now. There's no setting for this currently, but I'll feed this enhancement request into our development team. Cheers, Steve
Giuseppe Iannacone Posted February 5, 2019 (Author)

@Steve G and @James Ainsworth It seems that even with the serial number validation we have duplicate entries. Do you know why this is happening? We are using the syntax above and the client goDBAssetImport_v1_7_2.
Steve G Posted February 6, 2019

Hi @Giuseppe Iannacone,

Can you send me a copy of the entire config JSON please (with the instance name, API key, and any DB authentication details removed), and I'll take a look. Thanks, Steve
Giuseppe Iannacone Posted February 14, 2019 (Author)

Can anyone support us? Even with the suggestion provided by @Steve G we have duplicate entries (with the validation criterion set on h_serial_number).
Victor Posted February 14, 2019

@Giuseppe Iannacone - I understand from @Steve G that you confirmed yesterday morning that duplicates no longer occur. Is this no longer the case? Are you getting duplicates again since yesterday morning?
Steve G Posted February 14, 2019

@Giuseppe Iannacone Just seen your follow-up message re: more duplicates. The tool is working as expected - I've tried again and I can't replicate this with our SCCM data and the config that I've provided, so I can only assume that this must be an issue with either the data (in SCCM, or from your imports last month, looking at those dates) or how it's returned by the SQL query. From the table above, if you delete the duplicate records with the earliest "Date Created" value and re-run the import, are those records duplicated again? Thanks, Steve
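Steve's cleanup step (delete the duplicates with the earliest "Date Created", keep the newest, then re-run the import) could be scripted roughly as below. This is a hypothetical sketch, assuming assets have been exported as dictionaries; only `h_serial_number` comes from the thread, while the `created` and `id` field names are illustrative.

```python
def stale_duplicates(assets):
    """Group assets by serial number; for any serial with more than one
    record, return every record except the most recently created one,
    i.e. the deletion candidates per Steve's suggestion.
    Field names other than h_serial_number are illustrative."""
    by_serial = {}
    for asset in assets:
        by_serial.setdefault(asset["h_serial_number"], []).append(asset)
    stale = []
    for group in by_serial.values():
        if len(group) > 1:
            group.sort(key=lambda a: a["created"])  # oldest first
            stale.extend(group[:-1])                # keep only the newest
    return stale
```

If the import is behaving, re-running it after removing the stale records should update the surviving assets in place rather than recreating the deleted ones.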
Giuseppe Iannacone Posted February 14, 2019 (Author)

@Victor and @Steve G It seems the duplicates are created at random. Please consider that: 1. the sync tool runs every hour; 2. the duplicated item seems to be a different one each time (it looks to me like a random failure of the validation criterion). I will provide more detail once I have compared the duplicate items, and I will send the log file privately to @Steve G.
Giuseppe Iannacone Posted February 14, 2019 (Author)

@Victor and @Steve G I confirm that new duplicates appear after the old ones are removed, and this has been happening for at least the last three days.
Giuseppe Iannacone Posted February 14, 2019 (Author)

@Victor and @Steve G I've found an error message in the log file for a specific duplicate:

```
2019/02/14 10:34:53 [ERROR] API Call failed when searching instance for existing Asset:Post https://mdh-p01-api.hornbill.com/datalogic/xmlmc//data/?method=entityBrowseRecords2: dial tcp 78.129.173.117:443: connectex: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.
2019/02/14 10:34:53 [DEBUG] API Call XML: <params><application>com.hornbill.servicemanager</application><entity>AssetsComputer</entity><searchFilter><column>h_serial_number</column><value>1DDN8H2</value><matchType>exact</matchType></searchFilter><maxResults>1</maxResults></params>
2019/02/14 10:34:53 [DEBUG] Create Asset: 1DDN8H2
2019/02/14 10:34:53 Unable to Search for Group: The element <entity> was not expected at location '/methodCall/params/entity'
2019/02/14 10:34:53 [DEBUG] API XML: <params><application>com.hornbill.servicemanager</application><entity>AssetsComputer</entity><searchFilter><column>h_serial_number</column><value>1DDN8H2</value><matchType>exact</matchType></searchFilter><maxResults>1</maxResults><application>com.hornbill.core</application><queryName>GetGroupByName</queryName><queryParams><h_name><nil></h_name><h_type>5</h_type></queryParams></params>
```
Steve G Posted February 14, 2019

Hi @Giuseppe Iannacone,

That's great, thanks. According to the log, you're seeing intermittent connectivity issues between the PC running the import utility and your Hornbill instance. When this happens during the API call that checks whether the asset already exists, no asset ID is returned (as expected - the tool essentially couldn't find a match), so the asset is created rather than updated. This is why I wasn't able to recreate your issue from this end. Are you aware of connectivity issues at your site? @Victor FYI.

I could make some modifications to the tool so that if the search for an existing asset fails, a new asset won't be created. Would that be appropriate in this instance? It would mean that if the asset didn't exist and you saw one of these network issues, the asset wouldn't be imported until the next run of the tool. Thanks, Steve
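The fail-safe behaviour Steve proposes (never create an asset when the existence check itself errored, as opposed to succeeding with no match) can be sketched as below. This is hypothetical Python, not the actual Go tool: `AssetSearchError`, `decide_action`, and `search_asset` are invented names, with `search_asset` standing in for the `entityBrowseRecords2` API call.

```python
class AssetSearchError(Exception):
    """The search API call itself failed (e.g. a TCP timeout), which is
    different from a successful call that simply found no match."""


def decide_action(search_asset, serial):
    """Return what to do for one imported row. On a transport failure we
    skip the row entirely, so the next scheduled run can retry instead of
    risking a duplicate create."""
    try:
        match = search_asset(serial)  # returns the existing asset, or None
    except AssetSearchError:
        return "skip"                 # couldn't verify - don't create anything
    return "update" if match else "create"
```

The trade-off Steve notes falls out of the "skip" branch: a genuinely new asset hit by a network blip is simply deferred to the next hourly run, which is far cheaper than deduplicating afterwards.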
Giuseppe Iannacone Posted February 14, 2019 (Author)

@Steve G Yes please, avoid creating a new asset if the API call fails. As far as I know there are no connectivity issues, and looking at the log the tool is able to continue after one second, or am I wrong?