Posts posted by Steve G
-
Hi @Ieuan Payne IPO,
As Gerry has mentioned, we already provide a few DevOps integrations via the iBridge, based around Releases and Work Items, but we could certainly add more. Which specific DevOps features are you wanting to automate? Is it just running an existing pipeline, or are there other requirements?
Thanks,
Steve
-
Hi @AndyGilly,
This error would suggest that the value provided to the SiteID input parameter isn't a valid, or accessible, ECM site. The operation basically uses the Set-Location cmdlet to set the execution location to the ECM site drive using the provided site ID, then uses the Add-CMDeviceCollectionDirectMembershipRule cmdlet (alongside a couple of other bits) to add the supplied device to the collection. The SiteID will be case-sensitive too, so it may be worth checking this first.
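For reference, the underlying steps look roughly like this (a simplified sketch, not the exact code the operation runs - the site code, collection ID and resource ID below are placeholder values):

```powershell
# Import the ConfigMgr module and move to the site drive.
# "ABC" is a placeholder site code - remember the value is case-sensitive.
Import-Module "$($ENV:SMS_ADMIN_UI_PATH)\..\ConfigurationManager.psd1"
Set-Location "ABC:"

# Add the device to the collection as a direct membership rule.
Add-CMDeviceCollectionDirectMembershipRule -CollectionId "ABC00012" -ResourceId 16777220
```

If `Set-Location` fails at that first step because the site drive doesn't exist or isn't accessible, you'd see an error much like the one you're getting.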
Cheers,
Steve
-
Hi @Martyn Houghton,
You could use the data::entityBrowseRecords2 API, pointing at the Contact entity in the com.hornbill.core application, for example:
<methodCall service="data" method="entityBrowseRecords2">
  <params>
    <application>com.hornbill.core</application>
    <entity>Contact</entity>
    <searchFilter>
      <column>h_email_1</column>
      <value>some.email@address.com</value>
      <matchType>exact</matchType>
    </searchFilter>
    <searchFilter>
      <column>h_contact_status</column>
      <value>0</value>
      <matchType>exact</matchType>
    </searchFilter>
  </params>
</methodCall>
This would return the contact with the defined email address in the h_email_1 column, and a status of 0 (active).
Cheers,
Steve
-
Hi @Ann,
You could always access the CSV file using the Database Asset Relationship Import tool via an ODBC data source, and write the query as appropriate to your data. See the DBConf details on the wiki page you've linked above for more information:
Thanks,
Steve
-
Hi @JoanneG,
The errors are rights-based, so the user who owns the API key you are using doesn't have the correct roles to perform the operations, specifically this one:
If you apply this role and try again, then this should be fine.
Cheers,
Steve
-
Hi @samwoo,
Yes, this will likely require a fair amount of work to achieve - I expect we'll need to cache all asset and extended information records from Hornbill into memory tool-side, then check each of the assets being imported against that cache. It's been on the to-do list to add this feature for a while though... I'll take a more detailed look in the new year and get back to you.
Cheers,
Steve
-
Hi @Jeremy,
Apologies for the late response on this.
There's a utility node to escape strings ready for insertion into JSON; this is what you need to use on the field in question before using the result of this node in your JSON:
Without pushing the description through this node, and adding it in its vanilla state to the JSON, your output is (as you have seen):
But prepare the string with the workflow utility before adding it to your JSON, and you get the desired result:
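To illustrate with a made-up description value: a raw multi-line string dropped straight into the JSON breaks the document, because the literal newline terminates the string:

```json
{ "description": "Line one
Line two" }
```

Whereas the escaped version produced by the utility node is valid JSON:

```json
{ "description": "Line one\nLine two" }
```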
Let me know if this helps.
Cheers,
Steve
-
Hi @JoanneG,
The decoding error is due to this erroneous comma in your config:
Removing this comma would mean that just Change Request CH00006679 is deleted.
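As an illustration (the property name here is just for the example, not necessarily the exact key in your config), a trailing comma like this is invalid JSON and triggers the decoding error:

```json
{ "RequestReferences": ["CH00006679",] }
```

With the comma removed, the array parses cleanly and only CH00006679 is targeted:

```json
{ "RequestReferences": ["CH00006679"] }
```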
Cheers,
Steve
-
Hi @samwoo,
I didn't notice the Create/Update/Both request when working on the PreserveState feature, so I've now added it to the tool; v1.12.0 contains that feature:
https://github.com/hornbill/goDBAssetImport/releases/tag/v1.12.0
Check out the OperationType property of the AssetTypes objects; you can set it to Both, Create or Update. Leaving the parameter blank (or indeed not even including it) will perform the previous logic (so Both).
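For example, an AssetTypes entry restricted to updates only might look like this (property names other than OperationType are illustrative, and the other properties of the object are omitted for brevity):

```json
{
  "AssetTypes": [
    {
      "AssetType": "Server",
      "OperationType": "Update"
    }
  ]
}
```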
Cheers,
Steve
-
Hi @samwoo,
I've just released v1.11.0 of the DB Asset Import Tool, which supports preserving the Status/SubStatus/Operational Status values when updating assets. It can be downloaded from here:
https://github.com/hornbill/goDBAssetImport/releases/latest
Check out the new PreserveState, PreserveSubState and PreserveOperationalState options in the AssetTypes definitions.
Setting the Status to Active when assets are inserted is already possible; you just need to add:
"h_record_state": "1",
To the AssetGenericFieldMapping.
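Putting those together, the relevant parts of the config might look like this (a sketch showing only the properties mentioned above; the rest of the configuration is omitted):

```json
{
  "AssetTypes": [
    {
      "PreserveState": true,
      "PreserveSubState": true,
      "PreserveOperationalState": true
    }
  ],
  "AssetGenericFieldMapping": {
    "h_record_state": "1"
  }
}
```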
Cheers,
Steve
-
Hi @Frank Reay,
As assets are audited, this can be done with a report against the audit table. I've attached an example report definition, which can be imported into your Service Manager reporting in the admin console.
Cheers,
Steve
-
Hi @AndyGilly,
That's done, give it 5 mins and you should see the change on your instance.
Note: rather than using the Cloud Automations to interact with Service Manager on your instance, you could use the Hornbill Automations to log new requests, which means you no longer need to provide an API key in KeySafe:
Cheers,
Steve
-
Hi @Michael Risby,
This was indeed a defect, which we've just fixed and released. Try it again on your instance in 5 mins or so and you should get the requested content type passed to your external endpoint. Let me know how you get on.
Cheers,
Steve
-
No worries @AndyGilly. Nope, the process is all automatic; it can just take a few minutes for the fix to be delivered. The next time you open your workflow/runbook designer, it should just work.
Cheers,
Steve
-
Hi @AndyGilly,
Apologies for any inconvenience caused, but this issue is now fixed and will be available on your instance in the next few minutes.
Cheers,
Steve
-
Hi @m.vandun,
This message just means that the report is already being run on your Hornbill instance, either from the admin tool or by a script that has executed it. You can check the running status of a report in the admin console, in the Status column of the Report History tab. You (or your automated scripts) will need to wait until the current report run is complete before firing off another request.
Cheers,
Steve
-
Hi @JamieMews,
This is a known issue with reporting in the platform, which has been fixed and is awaiting release. As a workaround, we've updated the R scripts (and the Python ones too) so that they now support the use of XLSX report output. Please see the following post for details of the R updates:
Kind regards,
Steve
-
@Jeremy Strange, can you PM me a copy of your workflow so I can take a look please?
-
@Jeremy The easiest way to check whether this is an issue with your endpoint or with Hornbill sending the requests would be to set up your own RequestBin (I just used a free public one in the above test) and point your HTTP Request node at that, leaving the headers and body as they are and just replacing the URL. Then check to see if the request is received when your workflow sends it. This should prove that the node and workflow are working, and that there may be an issue with your endpoint.
Thanks,
Steve
-
Hi @StephC,
I've just released version 1.6.0 of the Hornbill Data Export Tool, which supports the use of XLSX files from reports, as well as the existing CSV file support. It can be downloaded from GitHub, and is documented on the Hornbill Wiki. Specifically, look for UseXLSX in the Configuration section, and make sure you switch XLSX on in the report config > Output Formats > Additional Data Formats:
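Once XLSX output is enabled on the report, the addition to the tool configuration would be along these lines (assuming the option is a boolean flag, with the rest of the config omitted):

```json
{ "UseXLSX": true }
```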
Cheers,
Steve
-
Hi @Joyce,
That's likely due to there being a UTF-8 character stream in the report output that sits outside of the Unicode codepage - Power BI doesn't play nicely with all UTF-8 characters (unlike pretty much every other system!). I've added some extra code into the script to do the conversion to allow Power BI to import the data; v1.8.0 is on GitHub now.
Note: as per the instructions, you will need to change the csvEncoding variable to fix this issue; a value of ISO-8859-1 will probably do it.
Cheers,
Steve
-
For anyone who is experiencing this issue and is using the R scripts to pull data from Hornbill Reporting and push it into Power BI, please see the attached post for an immediate workaround.
Thanks,
Steve
SCCM add to device collection error
in IT Operations Management
Posted
Hi @AndyGilly,
This is not WW environment specific; Windows servers restrict script execution by default. We've provided a Set Execution Policy operation in the Windows Management package that can be executed either side of the operation you wish to run, so that the local machine or current user policy doesn't need to be left in a permanently relaxed state. In a runbook, for example, you could use Set Execution Policy to relax the script execution policy for the current user, then execute your actual operation, then use Set Execution Policy to restrict it again.
I expect the AD package operations are OK because the target machine must already have a less-restrictive script execution policy in place, in either the local machine or current user policies.
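The relax/run/restrict pattern described above, expressed directly in PowerShell (the scope and policy values here are illustrative - pick whatever your security policy allows):

```powershell
# Relax the execution policy for the current user only
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser -Force

# ... run the operation you actually need here ...

# Restore the restrictive policy afterwards
Set-ExecutionPolicy -ExecutionPolicy Restricted -Scope CurrentUser -Force
```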
Cheers,
Steve