Steve G

Hornbill Developer
  1. Hi @Izu, Thanks for sending your config to Support. I've had a look, and it's configured to log a new Service Request at 9 am every day. The executable should be left running, and if it's running at 9 am then the new Service Request should be logged as expected. If you want to test this at a time that isn't 9 am, just change the CronSchedule parameter as required for testing, run the executable, then kill the executable before resetting the CronSchedule and re-running it. Let me know how you get on with this. Steve
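     As a rough sketch only: assuming the scheduler accepts standard five-field cron syntax (if it takes a six-field variant with a leading seconds value, the expression would be "0 0 9 * * *" instead), the relevant part of a config for 9 am daily would look something like the fragment below. Note that CronSchedule is the only parameter name taken from the post; the surrounding structure is illustrative.

         {
             "CronSchedule": "0 9 * * *"
         }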
  2. Hi @Izu, The API Scheduler is a self-contained scheduler utility, which is why it runs indefinitely. Its purpose is not to make one-off API calls, but to make API calls as part of a CRON schedule, so it's entirely possible that the schedule hasn't reached its defined point when the utility is being executed. More information can be found on the API Scheduler wiki page. If you send me a copy of the configuration file you're using (with the API Key removed), and the schedule on which you want to make the API calls, then I'll have a look at what's configured and point you in the right direction. Thanks, Steve
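     For reference, a minimal breakdown of a standard five-field cron expression (assuming the utility follows conventional cron syntax; the wiki page is authoritative for the exact format it accepts):

         0 9 * * *
         | | | | +-- day of week (* = any)
         | | | +---- month (* = any)
         | | +------ day of month (* = any)
         | +-------- hour (9)
         +---------- minute (0)

     So "0 9 * * *" fires once a day at 09:00.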
  3. Hi @Nick Brailsford, Importing the assets from Supportworks is just like importing them from an SCCM database (which there are examples of in the documentation); only the driver, SQL query and mappings will be different. As you're pulling data from Supportworks, the database driver should be set as follows: if you're using Supportworks 7.x and Core Services 3.x then set the driver to be swsql; if you're using Supportworks 8.x and Core Services 4.x then set the driver to be mysql. I actually wrote the tool, so if you have any questions about the import as you're building your config or running the tool, feel free to tag me in the post and I'll help where I can! Cheers, Steve
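     For illustration, the connection section of the config might look along these lines for a Supportworks 7.x source. The driver value comes from the post above, but the other field names and values here are assumptions, so check the SCCM examples in the documentation for the real schema:

         {
             "SQLConf": {
                 "Driver": "swsql",
                 "Server": "swhost",
                 "Database": "swdata",
                 "UserName": "sa",
                 "Password": "",
                 "Port": 5002
             }
         }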
  4. @Martyn Houghton Yeah, that's how it'll work I'm afraid; there's currently no way to add that time to newly created timers for the imported requests. I'm not sure it's even possible to start a timer in the past (i.e. from the log date of the original request being imported); it'd probably need changes to both the application and the platform, so it certainly won't be a quick fix... And that still wouldn't cater for time that the request has already spent on hold. I'll mull it over and discuss internally to see if there's something that we can do. Cheers, Steve
  5. Hi @Martyn Houghton, This is done via the BPM. Once a request is imported, if the request has a Service against it then the BPM defined against that Service is spawned for the request. In that BPM, when a Start Timers node is reached, the code works out the Service Level from the Service assigned to the request, as well as the Service Level Rules (and relevant request data), then populates the appropriate timers. Cheers, Steve
  6. Hi @Martyn Houghton, The visibility can be set using the h_updatetype column in the HistoricUpdateMapping. The historic updates functionality was originally based on the Supportworks call diary visibility, which used different bits in the value to work out the update type and visibility. Bit 512 is the "hide from the portals" bit, so if you set a value of 1 in h_updatetype then the update will have a "public" visibility (visible to customers who have access to the request in the portals), and if you set a value of 513 then the update will be private, so not visible in the portals. Cheers, Steve
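     To make the bit logic concrete, here's a small illustrative Go sketch. This is not code from the import tool itself, and the names are hypothetical:

         package main

         import "fmt"

         // hideFromPortalsBit is the visibility bit described above: when
         // set in h_updatetype, the update is hidden from the portals.
         const hideFromPortalsBit = 512

         // visibilityFor is a hypothetical helper mapping an h_updatetype
         // value to portal visibility: 1 -> public, 513 (1 + 512) -> private.
         func visibilityFor(updateType int) string {
             if updateType&hideFromPortalsBit != 0 {
                 return "private"
             }
             return "public"
         }

         func main() {
             fmt.Println(visibilityFor(1))   // public
             fmt.Println(visibilityFor(513)) // private
         }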
  7. @Martyn Houghton, The Integration Connectors, API & Webhooks forum would be the best place. Cheers, Steve
  8. @Martyn Houghton Just a quick note - I've written and released a generic request import tool, which you can use to import your Desk.com data using ODBC (so it can access CSV, XLS data etc.), or from a direct database source. Details and a link to the download are on the Hornbill wiki. This is the tool that should be used going forward, as the previous ODBC-only tool, which only reached beta, will be deprecated. Unlike the previous ODBC import tool, this utility doesn't require a merge of your request data and historic update data into the same CSV or XLS worksheet, so it's much easier to get your data to an import-ready state. With Excel spreadsheets, the request and diary update data are stored in different worksheets, and with CSV files they can be stored in completely different CSVs - just as long as the CSVs are in the same folder and the ODBC connection to them is configured as such. I've supplied demo config files for SQL, CSV and XLS imports as part of the release, so it should hopefully be self-explanatory; have a look at how the RequestQuery and RequestHistoricUpdateQuery params are structured (a rough sketch follows below). Give me a tag on here if you have any issues with it. Cheers, Steve
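     For a rough idea of the shape, the two query params might be structured like this. The SQL and column names below are invented for illustration, so refer to the demo config files in the release for the real structure:

         {
             "RequestQuery": "SELECT callref, summary, description, status FROM requests",
             "RequestHistoricUpdateQuery": "SELECT callref, updatedate, updatetext FROM updates"
         }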
  9. Hi @grahambird80, Glad to hear that's working now, I'll go and correct the documentation. Cheers, Steve
  10. Hi @grahambird80, The value of Image > URI is incorrect. It's currently set to thumbnailPhoto, but this needs wrapping with square brackets, i.e. [thumbnailPhoto], otherwise no data mapping will take place and the tool will assume the literal string thumbnailPhoto is the image data. So correcting that in your config in the admin console should fix your issue, as long as the image data stored in the thumbnailPhoto attribute is JPG. Let me know how you get on. Steve
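     For illustration, the corrected mapping would look something like the fragment below. The surrounding key names are assumptions from memory rather than a verified schema; the important part is just the square brackets around thumbnailPhoto:

         "Image": {
             "ImageType": "jpg",
             "URI": "[thumbnailPhoto]"
         }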
  11. @Martyn Houghton One of my colleagues recently released an ODBC request import tool that can use a CSV as its data source, so that may do what you need. It's in beta at the moment, but it lives here if you want to take a look: https://github.com/hornbill/goODBC_RequestImport I'll have a look at the code and see how far away it is from being promoted out of beta. Will let you know once done. Thanks, Steve
  12. Hi @Martyn Houghton, I don't think we've had a requirement to import requests from a Desk.com instance as yet, but it should be relatively straightforward to build something to help you do that. As Desk is in the cloud, do you have access to the data directly (for example a dump of the database that we can read records from), or would you be looking to grab the data via their REST APIs? Cheers, Steve
  13. Hi @grahambird80, That error is being returned as the data being written to the image files in the session cannot be converted by the platform, so I wonder what data is being written into them... Can you please send me a copy of your config JSON from within the Debug tab of the data import config, and I'll make sure everything is as it should be with that. Kind regards, Steve
  14. Hi @Izu, As per the Cleaner Tool documentation, this will be an array of Service IDs, which are integers and can be found in the URL of the Service Record page. Once you've found those Service IDs, you can build one config file per retention policy and schedule them accordingly (there's a rough sketch below). Kind regards, Steve
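     As an illustrative sketch only - the field names here are assumptions, and the only grounded detail is that the services are referenced by an array of integer IDs - one retention policy's config might contain something like:

         {
             "CleanRequests": true,
             "RequestServices": [1, 7, 12]
         }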
  15. Hi @grahambird80, This was caused by a bug in the tool, which I've fixed this morning. You can pull the latest version of the tool (v3.1.4) from here: https://github.com/hornbill/goLDAPUserImport/releases/tag/3.1.4 No changes are required to the config, so just replace the executable with your specific platform variant and it should just work. Let me know how you get on with it. Cheers, Steve