Posts posted by Steve G

  1. Hi @Joanne,

    The "unsupported protocol scheme" error points to the tool not being able to connect to the Hornbill instance, probably due to one of the following: 

    • The (case-sensitive) instance ID defined in the configuration JSON is incorrect;
    • Access to files.hornbill.com and files.hornbill.co is blocked from the PC that you're running the tool from.

    If you can send me your configuration file (with the API keys removed), I'll take a look at the config to make sure everything is ok there.
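    In the meantime, as a rough illustration, the instance ID sits in the config JSON something like the sketch below (key names here are illustrative, so check them against the documentation for the tool you're running); it's the InstanceId value that's case-sensitive:

    {
        "APIKey": "your-api-key",
        "InstanceId": "yourinstancename"
    }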

    Thanks,

    Steve

     

  2. Hi @Izu,

    Thanks for sending your config to Support. I've had a look, and it's configured to log a new Service Request at 9 am every day:

    [screenshot: scheduler configuration]

    The executable should be left running, and if it's running at 9 am then the new Service Request should be logged as expected. If you want to test this at a time that isn't 9 am, just change the CronSchedule parameter as required for testing, run the executable, then kill the executable before resetting the CronSchedule and re-running it.
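    As an illustration, and assuming the common five-field cron syntax (minute, hour, day of month, month, day of week - check the API Scheduler documentation for the exact format your version expects), a 9 am daily schedule in the config looks something like this:

    {
        "CronSchedule": "0 9 * * *"
    }

    So to test at, say, 2:30 pm, you'd temporarily set the value to 30 14 * * * and restart the executable.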

    Let me know how you get on with this.

    Steve

  3. Hi @Izu,

    The API Scheduler is a self-contained scheduler utility, which is why it runs indefinitely. Its purpose is not to make one-off API calls, but to make API calls on a CRON schedule, so it's entirely possible that the schedule hasn't reached its defined point while the utility is running. More information can be found on the API Scheduler wiki page.

    If you send me a copy of the configuration file you're using (with the API Key removed), and the schedule on which you want to make the API calls, then I'll have a look at what's configured and point you in the right direction.

    Thanks,

    Steve

  4. Hi @Nick Brailsford,

    Importing the assets from Supportworks is just like importing them from an SCCM database (there are examples of this in the documentation); only the driver, SQL query and mappings will be different :) 

    As you're pulling data from Supportworks, the database driver should be one of the following (see the config sketch after this list):

    • If you're using Supportworks 7.x and Core Services 3.x, set the driver to swsql
    • If you're using Supportworks 8.x and Core Services 4.x, set the driver to mysql
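    As a sketch, with illustrative property names and placeholder values (the SCCM examples in the documentation show the exact structure), the database section of the config would look something like:

    {
        "SQLConf": {
            "Driver": "swsql",
            "Server": "your-supportworks-server",
            "Database": "swdata",
            "UserName": "your-db-user",
            "Password": "your-db-password",
            "Port": 5002,
            "Query": "SELECT * FROM your_asset_table"
        }
    }

    Just switch the Driver value to mysql if you're on Supportworks 8.x and Core Services 4.x.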

    I actually wrote the tool, so if you have any questions about the import as you're building your config or running the tool, feel free to tag me in the post and I'll help where I can!

    Cheers,
    Steve

  5. @Martyn Houghton

    Yeah, that's how it'll work I'm afraid; there's currently no way to add that time to newly created timers for the imported requests. I'm not sure it's even possible to start a timer in the past (i.e. from the log date of the original request being imported); it'd probably need changes to both the application and the platform, so it certainly won't be a quick fix... And that still wouldn't cater for time that the request has already spent on hold :( 

    I'll mull it over and discuss internally to see if there's something that we can do.

    Cheers,
    Steve

  6. Hi @Martyn Houghton,

    This is done via the BPM. Once a request is imported, if the request has a Service against it then the BPM defined against that Service is spawned for the request. In that BPM, when a Start Timers node is reached, the code works out the Service Level from the Service assigned to the request, as well as the Service Level Rules (and relevant request data), then populates the appropriate timers.

    Cheers,
    Steve

  7. Hi @Martyn Houghton,

    The visibility can be set using the h_updatetype column in the HistoricUpdateMapping. The historic updates functionality was originally based on the Supportworks call diary visibility, which used different bits in the value to work out the update type and visibility.

    Bit 512 was the "hide from the portals" bit, so if you set a value of 1 in h_updatetype then the update will have "public" visibility (visible to customers who have access to the request in the portals), and if you set a value of 513 then the update will be private, so not visible in the portals.
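    So in the HistoricUpdateMapping section of your config (other mapped columns omitted, and the surrounding structure is illustrative), something like this keeps the historic updates private, since 513 is just update type 1 with the 512 "hide from the portals" bit added (1 + 512 = 513); change the value to 1 to make them portal-visible:

    {
        "HistoricUpdateMapping": {
            "h_updatetype": "513"
        }
    }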

    Cheers,
    Steve

  8. @Martyn Houghton

    Just a quick note - I've written and released a generic request import tool, which you can use to import your Desk.com data via ODBC (so it can access CSV, XLS data etc.) or from a direct database source. Details and a link to download are on the Hornbill wiki.

    This is the tool that should be used going forward, as the previous ODBC-only tool, which only reached beta, will be deprecated.

    Unlike the previous ODBC import tool, this utility doesn't require you to merge your request data and historic update data into the same CSV or XLS worksheet, so it's much easier to get your data into an import-ready state. With Excel spreadsheets, the request and diary update data are stored in different worksheets, and with CSV files they can be stored in completely different CSVs, as long as the CSVs are in the same folder and the ODBC connection to them is configured as such. I've supplied demo config files for SQL, CSV and XLS imports as part of the release, so it should hopefully be self-explanatory; have a look at how the RequestQuery and RequestHistoricUpdateQuery params are structured.
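    To give a flavour of the structure (the file names here are invented for illustration - the demo configs that ship with the release show the real thing), the two queries point at the request and diary sources independently:

    {
        "RequestQuery": "SELECT * FROM [requests.csv]",
        "RequestHistoricUpdateQuery": "SELECT * FROM [updates.csv]"
    }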

    Give me a tag on here if you have any issues with it :)

    Cheers,
    Steve

  9. Hi @grahambird80,

    The value of Image > URI is incorrect. It's currently set to thumbnailPhoto, but this needs wrapping in square brackets as [thumbnailPhoto], otherwise no data mapping will take place and the tool will assume the literal string thumbnailPhoto is the image data :) :

    [screenshot: current Image > URI setting]

     

    So the config in the admin console should look like this:

    [screenshot: corrected Image > URI setting]

     

    So correcting that should fix your issue, as long as the image data stored in the thumbnailPhoto attribute is JPG.
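    For reference, in config terms the working mapping is just the attribute name wrapped in square brackets, along these lines (the surrounding key names are illustrative):

    {
        "Image": {
            "ImageType": "jpg",
            "URI": "[thumbnailPhoto]"
        }
    }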

    Let me know how you get on.

    Steve

  10. @Martyn Houghton

    One of my colleagues recently released an ODBC request import tool that can use a CSV as its data source, so it may do what you need. It's in beta at the moment, but it lives here if you want to take a look: https://github.com/hornbill/goODBC_RequestImport

    I'll have a look at the code and see how far away it is from being promoted out of beta. Will let you know once done.

    Thanks,
    Steve

  11. Hi @Martyn Houghton,

    I don't think we've had a requirement to import requests from a Desk.com instance as yet, but it should be relatively straightforward to build something to help you do that. As Desk is in the cloud, do you have access to the data directly (for example a dump of the database that we can read records from), or would you be looking to grab the data via their REST APIs?

    Cheers,
    Steve

Hi @Izu,

    From the Cleaner Tool documentation:

    "RequestServices: An array containing Service ID Integers to filter the requests for deletion against. An empty array will remove the Service filter, meaning requests with any or no service associated will be deleted."

    So this will be an array of Service IDs, which are integers and can be found in the URL of the Service record page:

    [screenshot: Service ID shown in the Service record URL]

    So once you've found those Service IDs, you can build one config file per retention policy and schedule them accordingly.
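    As an example, a config for a retention policy covering two services whose IDs are 12 and 27 (invented values for illustration) would contain:

    {
        "RequestServices": [12, 27]
    }

    and leaving the array empty ([]) removes the Service filter entirely, so requests with any or no service would be eligible for deletion.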

    Kind regards,
    Steve

  13. @Darren Rose @Victor,

    I've not come across this issue before, but a quick Google shows that lots of other people have seen this error with many different data sources (not just R, but SQL, CSV, XLS etc.). It's not an issue with our import scripts; it's a problem within Power BI itself. It seems that the data type of one of the columns doesn't match the data being presented to it, hence the type mismatch error, so the affected column's type will need to be changed in the Query Editor in Power BI. Take a look at this thread for more information:

    https://community.powerbi.com/t5/Desktop/Data-refresh-error-OLE-DB-or-ODBC-error-Exception-from-HRESULT/td-p/39691

    If you can't identify which column is causing the issue, then you may need to delete the table in Power BI and re-import the data. Hopefully, it won't come to that though!

    Let me know how you get on with this.

    Thanks,

    Steve

     

Hi @HHH,

    The requested integrations should now be available on your instance :)

    When adding an integration node to a workflow using the Hornbill Integration Bridge as the defined connector, you can now create, update or retrieve Ideas against your products in Aha! from your Hornbill workflows: 

    [screenshot: Aha! integration options]

    Documentation for this integration can be found on our wiki.

    Let me know how you get on with these integrations!

    Thanks,

    Steve

Hi @HHH,

    Would you be using Aha! Ideas as "development proposals" in the Aha! platform? We could build integrations that are used within your Hornbill workflows, to create and update Ideas when requests raised in Service Manager reach a certain point in their workflow. And using the response from the Idea creation, you could then write a link to the Aha! Idea in the Service Manager request.

    Is this the sort of integration you're looking for? And are there any other Aha! integrations that you'd want us to take a look at?

    Thanks,

    Steve

  16. Hi @Keith,

    Sorry, I must've missed this post. Getting Flow to fire API calls off to Hornbill is fairly trivial (albeit not very elegant) with the HTTP action. Following this up with a Parse JSON action containing the relevant schema means that you can read/use the API response too. It's not pretty, but it works...

    Here's an example that will log an Incident in Service Manager, showing how the HTTP and Parse JSON actions should be configured (with the instance name and API key hidden). This will make the requestId, summary and warnings API output params available to the rest of your flow.

    The XML for the XMLMC call needs to be manually added to the body of the POST action, with details (summary and description) being populated using inputs to the flow:

    [screenshot: HTTP action configuration]
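    For reference, the shape of that XMLMC body is along these lines - a sketch from memory, so double-check the service and method names against the logIncident API documentation; summary and description carry the flow inputs:

    <methodCall service="apps/com.hornbill.servicemanager/Incidents" method="logIncident">
        <params>
            <summary>Summary text from the flow input</summary>
            <description>Description text from the flow input</description>
        </params>
    </methodCall>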

    The JSON schema to parse the output of logIncident, as an example, is: 

    {
        "type": "object",
        "properties": {
            "@@status": {
                "type": "boolean"
            },
            "params": {
                "type": "object",
                "properties": {
                    "requestId": {
                        "type": "string"
                    },
                    "summary": {
                        "type": "string"
                    },
                    "warnings": {
                        "type": "string"
                    }
                }
            },
            "flowCodeDebugState": {
                "type": "object",
                "properties": {
                    "step": {
                        "type": "string"
                    },
                    "executionId": {
                        "type": "string"
                    }
                }
            }
        }
    }

    Hope this helps,

    Steve
