Steve G

Hornbill Developer
About Steve G

  • Rank
    Advanced Member

Profile Information

  • Gender
    Male
  • Location
    Hull, UK

Contact Methods

  • Website URL
    http://www.hornbill.com

  1. Hi @Dan Munns, That's done, and released to GitHub, with details & the release download there and on the wiki page: https://github.com/hornbill/goSnowCloudAssetImport/ https://wiki.hornbill.com/index.php/Snow_License_Manager_Cloud_Asset_Import I've added the ability to switch on/off the adding or updating of discovered assets. Let me know how you get on with this. Cheers, Steve
  2. Hi @Dan Munns, Sorry, I thought I'd already replied to this. The asset importer will update any assets it finds where the value contained within the field defined in the AssetID object in the config matches an asset value in Hornbill, where the Hornbill table column is defined by the AssetIdentifier object in the config. So where the value returned by Asset.Name matches a value in the main Hornbill assets table column h_name (as defined by AssetIdentifier), and the AssetGenericFieldMapping is configured accordingly, the tool should find matches on the second and subsequent runs of the import. So as long as the tool finds a match, and the operational or record state of the record (or indeed any of the other mapped columns) contains a different value, then the record will be updated. If no match is found, then a new asset record is created. Kind regards, Steve
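To illustrate the matching described above, here's a rough sketch of the relevant config objects. The object names (AssetID, AssetIdentifier, AssetGenericFieldMapping) and the h_name column are as discussed; the exact key layout and placeholder syntax may differ between tool versions, so treat this as a sketch and check the wiki page for the authoritative schema:

```json
{
  "AssetID": "Name",
  "AssetIdentifier": "h_name",
  "AssetGenericFieldMapping": {
    "h_name": "[Name]",
    "h_description": "[Description]"
  }
}
```

With this in place, the value returned for each asset's Name is compared against h_name in the Hornbill assets table to decide between update and insert.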
  3. Hi @Dan Munns, Apologies for the late response, I've been on leave. Great to see that you're using the tool and it's working for you! Yes, this can be done in the tool, by setting the value of one (or both) of the following fields in the AssetGenericFieldMapping section of the config JSON. Note that this means you will need one JSON config file per record state being imported, as the values will need to be hard-coded in the JSON:

     h_record_state : this is the State field from the asset record, and takes the following integer values:
     0 - Current
     1 - Active
     2 - Archived

     h_operational_state : this is the Operational State field from the asset record, and takes the following integer values:
     0 - Operational
     1 - Pre-Production
     2 - Retired

     So if you include a state clause in your asset type filters to only return assets with a specific status (Status eq 'Active', for example) to match the hard-coded values set in h_record_state and/or h_operational_state, then you should see the correct statuses against the imported assets. For example, a config that hard-codes h_record_state to 2 and h_operational_state to 2 would give all assets imported or updated by the tool a state of Archived and an operational state of Retired. I'll add support for record & operational status mapping to the tool when I get a chance, so you can roll these up into the same config. Will let you know once that's available. Kind regards, Steve
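As a rough sketch of the hard-coding approach (the surrounding config is omitted, and other mapped fields would sit alongside these), a config that imports everything as Archived/Retired would contain something like:

```json
{
  "AssetGenericFieldMapping": {
    "h_record_state": "2",
    "h_operational_state": "2"
  }
}
```

Here 2 maps to Archived for h_record_state and to Retired for h_operational_state, per the value lists above.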
  4. Hi @Dan Munns, As per the email I've just sent, a 404 error here generally points to an incorrect instance name being used. Could you check the instance name in the conf.json please, ensuring that it's correct, and in the correct case? Thanks, Steve
  5. Hi @Oscar Stankard, We are in the process of making a number of enhancements to how the request search works, but in the meantime did you know about the Ctrl+Shift+F shortcut? Hitting this key combination from anywhere within Hornbill will present you with a Quick Search box, allowing you to quickly get to a Request that you know the reference for. Just type or paste in a Request ID, and click Open Request to be taken straight to it. Kind regards, Steve
  6. Steve G

    Hornbill CSV Asset Import

    Hi @samwoo, I've had a look at the CSV asset importer, and defined the h_location_type column to be populated using the locationType mapped column from a dummy CSV file, and the location type is populated correctly once the import has run. I've also tried hard-coding the h_location_type value into the config JSON, and again the field is populated as expected. So I'm not sure why your assets are not being populated with the location type, I'm afraid. If you want to PM me a snapshot of your CSV, and a copy of your import config, then I'll take a look and see if I can spot what's different for you. Kind regards, Steve
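For reference, a minimal sketch of the mapping I tested is below. The locationType column name comes from my dummy CSV (yours will differ), and the bracketed placeholder syntax for mapped CSV columns is an assumption here; the tool's wiki page documents the exact form:

```json
{
  "AssetGenericFieldMapping": {
    "h_location_type": "[locationType]"
  }
}
```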
  7. Steve G

    Hornbill CSV Asset Import

    Hi @samwoo, The latest release of the CSV importer (v1.1.0), released on 04/12/2017, does include the executables, and is accessible from here: https://github.com/hornbill/goCSVAssetImport/releases/latest With regards to the Location Type, I'll have a look and see what's going on there as this field should be independent from the Location field. I'll let you know when I've found what's causing this. Kind regards, Steve
  8. Steve G

    Export List Defect

    @Dan Munns, I've replicated and fixed this issue, it'll be in the next Service Manager update. Kind regards, Steve
  9. Steve G

    Updating Existing Assets

    @dwalby, Yes, if the imports are configured correctly then existing asset records will be updated rather than duplicates created. Kind regards, Steve
  10. Steve G

    Updating Existing Assets

    @dwalby, As @Victor has suggested, you could do this with a report. We do have a story planned to add export functionality to the assets lists, so I'll add you as an interested customer to the story. Kind regards, Steve
  11. Steve G

    Updating Existing Assets

    Hi @dwalby @DeadMeatGF @DC-BEN, Yes, the CSV Asset Import tool can import new or update existing asset records from a CSV, and can be scheduled. Please see the wiki & github pages for more information: https://wiki.hornbill.com/index.php/CSV_Asset_Import https://github.com/hornbill/goCSVAssetImport Kind regards, Steve
  12. Hi @samwoo, The approach that you've used isn't secure, as you're essentially sharing an API key that can run data::sqlQuery. Anyone who obtained it could use that API key for malicious purposes. If you've already pushed the script out with that access-all-areas API key contained within it, then you need to delete the API key from the Hornbill user that it belongs to! With regards to the script itself, I suggest you do not use data::sqlQuery at all, and use the following APIs in your code instead. Then create a custom role in Hornbill that only has permission to the relevant areas, assign this role to a user account for the integration, and generate an API key against this user instead. The user & role should be locked down as much as possible, so that the API key cannot be used for anything else. The first thing you need to do is identify the primary key of the asset you want to update. You can do this with data::entityBrowseRecords2 (https://api.hornbill.com/data/?op=entityBrowseRecords2), and the XMLMC API call would look something like:

      <methodCall service="data" method="entityBrowseRecords2">
        <params>
          <application>com.hornbill.servicemanager</application>
          <entity>AssetsComputer</entity>
          <matchScope>all</matchScope>
          <searchFilter>
            <column>h_name</column>
            <value>TheAssetName</value>
            <matchType>exact</matchType>
          </searchFilter>
        </params>
      </methodCall>

      This will return the asset record, including its primary key. Once you have the primary key for your asset, you can then use data::entityUpdateRecord (https://api.hornbill.com/data/?op=entityUpdateRecord) to update the values you need. The API call for this would be structured as so:

      <methodCall service="data" method="entityUpdateRecord">
        <params>
          <application>com.hornbill.servicemanager</application>
          <entity>AssetsComputer</entity>
          <primaryEntityData>
            <record>
              <h_pk_asset_id>The asset primary key (integer)</h_pk_asset_id>
              <h_last_logged_on>The Date/Time value</h_last_logged_on>
              <h_last_logged_on_user>The User URN</h_last_logged_on_user>
            </record>
          </primaryEntityData>
        </params>
      </methodCall>

      This is much safer, and more secure, than using data::sqlQuery. I hope this helps, Steve
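If you're scripting the browse-then-update flow, the payload construction can be sketched in Python. This only builds the entityBrowseRecords2 payload shown above; how you POST it (endpoint URL, instance name, API-key header) depends on your environment and is deliberately left out here:

```python
import xml.etree.ElementTree as ET


def build_browse_payload(entity, column, value):
    """Build the XMLMC entityBrowseRecords2 payload as an XML string."""
    root = ET.Element("methodCall", service="data", method="entityBrowseRecords2")
    params = ET.SubElement(root, "params")
    ET.SubElement(params, "application").text = "com.hornbill.servicemanager"
    ET.SubElement(params, "entity").text = entity
    ET.SubElement(params, "matchScope").text = "all"
    # The search filter matches one column exactly, as in the XML above.
    flt = ET.SubElement(params, "searchFilter")
    ET.SubElement(flt, "column").text = column
    ET.SubElement(flt, "value").text = value
    ET.SubElement(flt, "matchType").text = "exact"
    return ET.tostring(root, encoding="unicode")


payload = build_browse_payload("AssetsComputer", "h_name", "TheAssetName")
print(payload)
```

Building the XML with ElementTree (rather than string concatenation) also takes care of escaping any special characters in the asset name.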
  13. Steve G

    Implications of Changing the Admin Password

    Hi @Claire Holtham, The newer versions of the asset import tools use an API key to authenticate the Hornbill session, rather than a username and password, so you'd be better off creating an API key against the account that performs the imports, then using this in the configuration file of the latest version of the import tool. More information, and links to download the latest versions, can be found on the wiki page relevant to your import type: https://wiki.hornbill.com/index.php/Database_Asset_Import https://wiki.hornbill.com/index.php/CSV_Asset_Import Kind regards, Steve
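As a sketch only, the authentication section of the import tool's configuration file looks something like the following. The exact key names vary between tools and versions (these are illustrative, not authoritative), so check the relevant wiki page for your import type:

```json
{
  "APIKey": "your-api-key-here",
  "InstanceID": "yourinstancename"
}
```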
  14. Steve G

    Enable to connect with Power BI

    Hi @Joyce, Unfortunately not, as there is no option in Excel to use R scripts as a data source. Kind regards, Steve
  15. Steve G

    Enable to connect with Power BI

    @m.vandun @Joyce @Victor, The bad news: this is actually due to a defect (regression) in Power BI, in that these data source scripts no longer work for larger reports when they used to run fine. The actual problem is when we call RCurl::getURL within a loop in the R script. Once the report has been started, we enter a while loop, waiting 1 second between iterations before running the reportRunGetStatus API to check whether the report has completed. After the first iteration, if the report is not yet at a completed status, we loop round to get the report status after waiting another second. On this second run of getURL, Power BI causes getURL to hang indefinitely, until either Power BI times out (30 minutes) or you click the Cancel button. Once you've clicked Cancel or the timeout has occurred, the R engine does actually continue to run the script in the background, and gets the data from Hornbill, but Power BI is no longer expecting the data returned, so can do nothing with the response. If you were to run the same script directly against your R engine of choice, outside of Power BI, then the script works as expected. The good news: we do have a workaround until Microsoft fix this defect in Power BI, albeit not a very graceful one. You need to alter the number of seconds passed to suspendExec within the loop to be greater than the time it usually takes for the report to run, so that the report completes before reportRunGetStatus is run for the first time. So change row 82 in the script (the call to suspendExec within the while loop). You can find the average time taken to run the report in the reporting section of the admin tool for your Hornbill instance. Just add a few extra seconds on to the average process time when changing the value being passed to suspendExec, and the script won't need to run getURL more than once in the loop. Feel free to comment out the whole while(reportComplete == FALSE){ line, and the matching } character too, should you wish, to completely remove the looping (although this is not really necessary). I hope this helps, Steve