Posts posted by Steve G

  1. Hi @samwoo,

    The approach that you've used isn't secure, as you're essentially sharing an API key that can run data::sqlQuery. Anyone who gets hold of that key could use it for malicious purposes. If you've already pushed the script out with that access-all-areas API key contained within it, then you need to delete the API key from the Hornbill user that it belongs to!

    With regard to the script itself, I suggest you don't use data::sqlQuery at all, and use the following APIs in your code instead. Then create a custom role in Hornbill that only has permission to the relevant areas, assign this role to a user account for the integration, and generate an API key against this user instead. The user and role should be locked down as much as possible, so that the API key cannot be used for anything else.
    So the first thing you need to do is identify the primary key of the asset you want to update. You can do this with data::entityBrowseRecords2 (https://api.hornbill.com/data/?op=entityBrowseRecords2 ), and the XMLMC API call would look something like:

    <methodCall service="data" method="entityBrowseRecords2">
        <params>
            <application>com.hornbill.servicemanager</application>
            <entity>AssetsComputer</entity>
            <matchScope>all</matchScope>
            <searchFilter>
                <column>h_name</column>
                <value>TheAssetName</value>
                <matchType>exact</matchType>
            </searchFilter>
        </params>
    </methodCall>

    This will return the asset record, including its primary key.

    Once you have the primary key for your asset, you can then use data::entityUpdateRecord (https://api.hornbill.com/data/?op=entityUpdateRecord) to update the values you need. The API call for this would be structured like so:

    <methodCall service="data" method="entityUpdateRecord">
        <params>
            <application>com.hornbill.servicemanager</application>
            <entity>AssetsComputer</entity>
            <primaryEntityData>
                <record>
                    <h_pk_asset_id>The asset primary key (integer)</h_pk_asset_id>
                    <h_last_logged_on>The Date/Time value</h_last_logged_on>
                    <h_last_logged_on_user>The User URN</h_last_logged_on_user>
                </record>
            </primaryEntityData>
        </params>
    </methodCall>

    This is much safer, and more secure, than using data::sqlQuery.
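
    As a rough illustration, if you're making these calls from PowerShell (for example alongside the PowerShell module mentioned further down this page), the lookup could be sent along these lines. This is only a sketch: the endpoint URI, the API-key Authorization header format, and the shape of the returned XML are assumptions/placeholders here, so check them against your own instance and the module source before relying on it:

    # Minimal sketch: find an asset's primary key via data::entityBrowseRecords2.
    # NOTE: the $uri value and the Authorization header format are placeholders -
    # confirm the correct API endpoint and API-key header for your instance.
    $uri     = "https://yourinstanceapiendpoint/xmlmc/data/?op=entityBrowseRecords2"
    $headers = @{ "Authorization" = "ESP-APIKEY yourlockeddownapikey" }
    $body    = '<methodCall service="data" method="entityBrowseRecords2">' +
               '<params>' +
               '<application>com.hornbill.servicemanager</application>' +
               '<entity>AssetsComputer</entity>' +
               '<matchScope>all</matchScope>' +
               '<searchFilter>' +
               '<column>h_name</column>' +
               '<value>TheAssetName</value>' +
               '<matchType>exact</matchType>' +
               '</searchFilter>' +
               '</params>' +
               '</methodCall>'

    $r = Invoke-WebRequest -Uri $uri -Method Post -Headers $headers -ContentType "text/xmlmc" -Body $body -ErrorAction:Stop

    # Pull the primary key out of the response - inspect $r.Content on your own
    # instance, as the exact response structure is an assumption here
    [xml]$xml = $r.Content
    $assetId  = ($xml.SelectNodes("//h_pk_asset_id") | Select-Object -First 1).InnerText

    The same pattern would then apply for the data::entityUpdateRecord call above, with $assetId dropped into the h_pk_asset_id element of the second methodCall body.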

    I hope this helps,
    Steve

  2. Hi @Claire Holtham,

    The newer versions of the asset import tools use an API key to authenticate the Hornbill session, rather than a username and password, so you'd be better off creating an API key against the account that performs the imports, then using this in the configuration file of the latest version of the import tool.

    More information, and links to download the latest versions, can be found on the wiki page relevant to your import type:

    https://wiki.hornbill.com/index.php/Database_Asset_Import

    https://wiki.hornbill.com/index.php/CSV_Asset_Import

    Kind regards,

    Steve

  3. @m.vandun @Joyce @Victor,

    The bad news is that this is actually due to a defect (regression) in Power BI: these data source scripts no longer work for larger reports, where previously they ran fine.

    The actual problem occurs when we call RCurl::getURL within a loop in the R script. Once the report run has started, we enter a while loop, waiting 1 second between iterations before running the reportRunGetStatus API to check whether the report has completed. After the first iteration, if the report has not yet completed, we loop round to get the report status again after waiting another second. On this second call to getURL, Power BI causes getURL to hang indefinitely, until either Power BI times out (30 minutes) or you click the Cancel button. Once you've clicked Cancel or the timeout has occurred, the R engine does actually continue running the script in the background and gets the data from Hornbill, but Power BI is no longer expecting the returned data, so it can do nothing with the response.

    If you were to run the same script directly against your R engine of choice, outside of Power BI, then the script does work as expected.

    The good news is that we do have a workaround until Microsoft fix this defect in Power BI, albeit not a very graceful one. You need to alter the number of seconds passed to suspendExec within the loop so that it is greater than the time the report usually takes to run, allowing the report to complete before reportRunGetStatus is called for the first time. So change row 82 in the script, highlighted below:

    [screenshot: row 82 of the R script - the suspendExec call inside the loop]

    You can find the average time taken to run the report in the reporting section of the admin tool for your Hornbill instance, highlighted in green below:

    [screenshot: the report's average run time in the admin tool's reporting section, highlighted in green]

     

    Just add a few extra seconds on to the average process time when changing the value being passed to suspendExec, and the script won't need to run getURL more than once in the loop. Feel free to comment out the whole while(reportComplete == FALSE){ line, and its matching } character, should you wish to remove the looping completely (although this isn't really necessary).

    I hope this helps,

    Steve

     

  4. Hi @Dan Munns,

    There is actually a setting in Service Manager that enforces the selection of a catalog item within the Progressive Capture Service Details form:

    servicemanager.progressiveCapture.servicedetails.catalogRequired

    Enabling this will force your analysts to select a Catalog Item before being allowed to continue within the Progressive Capture. More details can be found here:

    https://wiki.hornbill.com/index.php/Progressive_Capture_Workflow

    Kind regards,

    Steve

  5. Hi @Joyce,

    If your report takes 30 minutes or more to run, then Power BI will time out the R script. This is a limitation of the Power BI implementation of R, and cannot be changed. See this page for more information:

    https://powerbi.microsoft.com/en-us/documentation/powerbi-desktop-r-scripts/

    Due to this Power BI limitation, we have also provided an R script that takes a Run ID as well as the Report ID as its input parameters, and imports data from report runs that have already completed. So you could point this at the report run that the other R script initiated, to get your data into Power BI:

    https://github.com/hornbill/rPowerBIHornbillDataSources/blob/master/PowerBIDataSource_HistoricReport.R

    There is a post on the following forum which describes the issue in more detail. The forum also allows Power BI users to vote on feature requests, which the Microsoft devs then use to prioritise what goes into Power BI. The more votes the better, to get Microsoft to fix this odd limitation :)

    https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/17885383-urgently-need-fix-for-r-script-timeout

    I hope this helps,

    Steve

  6. @Martyn Houghton,

    The Service Manager update that contains the supporting code for this is now available in the live app store, so you can update your instance. I've also released the update to the cleaner tool, which is on GitHub now:

    https://github.com/hornbill/goHornbillCleaner

    And I've also updated our wiki page to reflect these changes:

    https://wiki.hornbill.com/index.php/Hornbill_Clean_Utility

    Let me know how you get on with this.

    Cheers,

    Steve

     

  7. @Martyn Houghton,

    No, the conditions can be combined to build more complex filters, or indeed left blank to ignore that particular condition :) So, for instance, this setup would delete all Incidents with a Cancelled status that were logged before the 1st of January 2017:

    [screenshot: cleaner utility configuration deleting Cancelled Incidents logged before 1st January 2017]

    That's fine regarding the team and logged-by filtering. It's too late to get the required supporting code into the next Service Manager update, but I'll get it into the one directly after, and produce another cleaner utility build to go with it.

    Cheers,

    Steve

  8. Hi @Martyn Houghton,

    I've made some changes to the cleaner tool, and the supporting code in Service Manager, so that once it's released you will be able to filter requests to be deleted by:

    • Associated Service IDs (multiples of)
    • Request Statuses (multiples of)
    • Request Types (multiples of)
    • Requests logged after a specific date & time
    • Requests logged before a specific date & time

    So an example of how your configuration might look with this new version:

    [screenshot: example cleaner utility configuration using the new filters]

     

    As mentioned, I've had to make some changes within Service Manager to support the changes to the cleaner utility, so once these Service Manager changes have made it to live (they will be in the next update), I'll release the new version of the utility to our public GitHub and post back here to let you know.

    Cheers,

    Steve

  9. Hi @JBasey,

    As DeadMeatGF has rightly said, you can update assets with the DB asset import tool that we have provided. It can't delete assets from your Hornbill instance, but you can use it to update an asset's state to "Archived". You just need to add the h_record_state field to the AssetGenericFieldMapping, like so:

    "AssetGenericFieldMapping":{
    "h_name":"[MachineName]",
    "h_record_state":"2",
    "h_site":"[SiteName]",
    "h_asset_tag":"[AssetTag]",

    Where 2 is the value that represents an archived asset (0 for Current state, and 1 for Active state).

    So if you can identify your retired assets in the SQL query, and transform whatever status they have in SCCM into a Hornbill asset state there too, then you could define the state with a mapping rather than hard-coding it (in this mapping example, the transformed column is output by the SQL query as AssetState):

    "AssetGenericFieldMapping":{
    "h_name":"[MachineName]",
    "h_record_state":"[AssetState]",
    "h_site":"[SiteName]",
    "h_asset_tag":"[AssetTag]",

    I hope this helps, please let me know if you need any more information regarding this.

    Steve

  10. Hi @samwoo,

    I've added support for proxy authentication to the module, and the code is up on Github now.

    There's a new function that has been made available: Set-Proxy. It's documented within the source code, but you can use this in a couple of ways:

    If you want to use the current Windows user credentials to authenticate against the proxy, then you just need to add the following to your code at any point before you run Invoke-XMLMC:

    Set-Proxy "http://yourproxyaddress:andportifapplicable"

    The Set-Proxy function supports an additional input parameter, so that you can define proxy credentials to authenticate with. This additional parameter can either be just a username in a string (in which case the script will then ask for the password), or a PSCredential object, such as one created by the Get-Credential cmdlet. So to supply a username and have the script prompt for the password, add the following to your code before the Invoke-XMLMC call:

    $User = "Domain01\User01"
    Set-Proxy "http://yourproxyaddress:andportifapplicable" $User

    And if you want to supply a full PSCredential object:

    $User = "Domain01\User01"
    $PWord = ConvertTo-SecureString -String "password" -AsPlainText -Force
    $CredObject = New-Object -TypeName "System.Management.Automation.PSCredential" -ArgumentList $User, $PWord
    Set-Proxy "http://yourproxyaddress:andportifapplicable" $CredObject

    Although, it's probably easier if the account you run this with can just authenticate against your proxy server, rather than providing authentication details as a parameter to Set-Proxy, as you shouldn't really be saving plain-text passwords in your PowerShell scripts ;)
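
    If you do need credentials, a pattern along these lines keeps the password out of the script by prompting for it at runtime (the proxy address here is just a placeholder):

    # Prompt for the proxy credentials at runtime rather than hard-coding a password
    $CredObject = Get-Credential -Message "Enter your proxy credentials"
    Set-Proxy "http://yourproxyaddress:andportifapplicable" $CredObject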

    Let me know how you get on :)

    Cheers,

    Steve

  11. Hi @samwoo,

    Ah, I see. This is not a response from Hornbill; it's from your proxy. You'll need to add the -Proxy parameter, and one of the -ProxyCredential / -ProxyUseDefaultCredentials parameters, to the Invoke-WebRequest command in the module code, as per the MS documentation:

    https://msdn.microsoft.com/en-us/powershell/reference/5.1/microsoft.powershell.utility/invoke-webrequest

    So row 276 in the module source is currently:

    $r = Invoke-WebRequest -Uri $script:URI -Method Post -Headers $script:headers -ContentType "text/xmlmc" -Body $script:body -ErrorAction:Stop

    Add the following to the end of that row:

    -Proxy "yourproxyaddress" -ProxyUseDefaultCredentials
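
    So, assuming you go with the default-credentials route, the whole row would end up looking something like this:

    $r = Invoke-WebRequest -Uri $script:URI -Method Post -Headers $script:headers -ContentType "text/xmlmc" -Body $script:body -ErrorAction:Stop -Proxy "yourproxyaddress" -ProxyUseDefaultCredentials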

    This should route the request through your proxy, using your logged-on credentials as its authentication method.

    Could you let me know how you get on? If this works, I'll add some additional code to the module to cater for proxy authentication.

    Cheers,

    Steve

  12. Hi @samwoo,

    You probably just need to set the HTTP_PROXY or HTTPS_PROXY environment variables in your operating system. If you're running Windows, they can be set from the command line using the following: 

    set HTTP_PROXY=HOST:PORT

    set HTTPS_PROXY=HOST:PORT  

    Where "HOST" is the IP address or host name of your Proxy Server and "PORT" is the specific port number.

    Kind regards,

    Steve

  13. Hi @samwoo,

    Just to let you know, I released a Hornbill API PowerShell module to our public GitHub yesterday, which should allow you to fire Hornbill API calls from your PowerShell scripts. The repository for the module is here:

    https://github.com/hornbill/powershellHornbillAPIModule

    I've included a couple of usage examples - one for returning asset records, and one for returning an asset type then inserting a new asset record. The module functions are documented within the source code, but please let me know if you need any further information.

    Hope this helps!

    Steve
