
Steve G

Hornbill Developer
  • Posts: 745
  • Days Won: 30

Everything posted by Steve G

  1. Hi @Martyn Houghton, This is set using the h_updatetype property of the HistoricUpdateMapping section within the import config. This is an integer column, and the code looks for particular bits being set against the integer using bitwise operators to work out the update type and visibility for each row. Essentially, setting bit 1 marks the historic update type as User, and 512 marks a historic update as private, so setting the integer as 513 will be a private user update. This old post comment from the Supportworks forum describes the values in more detail: Kind regards, Steve
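The way those bits combine can be sketched as follows (the values 1 = User and 512 = private come from the post above; the constant names are invented here for readability, and this is not the import tool's actual code):

```python
# Illustrative sketch of the h_updatetype bit flags described above.
# Bit values are from the post; the names are hypothetical.
UPDATE_TYPE_USER = 1      # marks the historic update type as "User"
VISIBILITY_PRIVATE = 512  # marks the historic update as private

# Combine flags with bitwise OR to build the integer column value.
h_updatetype = UPDATE_TYPE_USER | VISIBILITY_PRIVATE
print(h_updatetype)  # 513: a private user update

# The import code can then test individual bits with bitwise AND.
is_user_update = bool(h_updatetype & UPDATE_TYPE_USER)
is_private = bool(h_updatetype & VISIBILITY_PRIVATE)
print(is_user_update, is_private)  # True True
```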
  2. Hi @Martyn Houghton, The logging of iBridge method invocations and errors to the ESPServerService log has been added to the platform and will be available in the next platform build (>2975). Cheers, Steve
  3. Hi @Paul Alexander, Just on the back of the reply from James, the last run date and status have both been added, and will be available when the next update hits your instance. Kind regards, Steve
  4. Hi @Izu, I've added a debug mode to the API Scheduler tool, which when enabled will output the API call request and response payloads to the log file. This should help in working out what's going on with the API calls that are being made, or identify the new records being created. This release (v1.2.0) can be downloaded from here: https://github.com/hornbill/goAPIScheduler/releases/download/1.2.0/goAPIScheduler_v1_2_0.zip No change to your configuration is required; you just need to replace the executable with the relevant one from the release, then add the -debug=true flag to the command when running it. Let me know how you get on. Cheers, Steve
  5. Hi @Izu, One way would be to create a custom view on your request list, filtered by the Status, Service and Catalog Item that you have defined in the API scheduler configuration. You should then have those requests listed. I'll have a look at the tool and see if we can add the API response body to the log for a successful API call. It won't be particularly elegant, but you'll then be able to see in the log any params returned by each API call (including the Request Reference from your call to logServiceRequest). I'll reply to this thread once it's done. Thanks, Steve
  6. Hi @RobD, You can use basic wiki markup in the authorisation details, but you need to change the modifier in the email template to wiki: Currently, only basic formatting (Bold, Italic, Ordered & Unordered lists) is supported. Please see the Variables > Modifiers section of the email templates wiki page for more information: https://wiki.hornbill.com/index.php/Email_Templates Cheers, Steve
  7. No worries @Joshua T M, glad to hear that's up and running! This is the API you need to retrieve attachment details: https://api.hornbill.com/apps/com.hornbill.servicemanager/Requests?op=smGetAttachments Cheers, Steve
  8. Hi @Izu, Once you'd fixed the cron expression, did you restart the executable? The config is only loaded when the executable first starts, so it needs to be killed and restarted to pick up any changes. Once a schedule defined in your expression is reached, you will see a message in the console, and in the log, stating whether the API call was successful or not. Thanks, Steve
  9. Hi @Giuseppe Iannacone, Ah OK, I see what you mean now. There's no setting for this currently, but I'll feed this enhancement request into our development team. Cheers, Steve
  10. Hi @Giuseppe Iannacone, The example that @James Ainsworth has provided above will prevent duplicates based on a matching serial number - if an identical serial number is found, then the asset details will be updated if any of the fields do not match their existing values. Are you looking for functionality in the tool to not allow updates if a match is found? Or for functionality in the UI to prevent analysts from creating assets with identical serial numbers? Thanks, Steve
  11. Hi @Joshua T M, Apologies, it would have helped if I'd pasted the link: https://api.hornbill.com/apps/com.hornbill.servicemanager/Requests?op=getActivityStreamAccessToken Cheers, Steve
  12. Hi @samwoo, There's already a tool that runs Hornbill reports and dumps the output files to a network share or folder of your choice. It's up on GitHub, and documented on the wiki. Cheers, Steve
  13. Hi @Joshua T M, This is possibly down to access rights, as no activity stream access token is being passed to the API call. In your script, if you call the following API to generate an access token, then pass that token into the accessToken parameter of your call to activityStreamQuery, you should then see the activities returned. Cheers, Steve
  14. Hi @Izu, The issue is with the CronSchedule expression defined in your configuration: "CronSchedule":"0 0 9 * * 1-7", You have 1-7 defined as the days of the week on which to make the API call - this is incorrect and is causing the scheduler to fail. It should be 0-6 to define every day of the week (with 0 being Sunday and 6 being Saturday) - or you could replace this with a *. Please see the following extract from the Cron package documentation for more information:

      Field name   | Mandatory? | Allowed values  | Allowed special characters
      ------------ | ---------- | --------------- | --------------------------
      Seconds      | Yes        | 0-59            | * / , -
      Minutes      | Yes        | 0-59            | * / , -
      Hours        | Yes        | 0-23            | * / , -
      Day of month | Yes        | 1-31            | * / , - ?
      Month        | Yes        | 1-12 or JAN-DEC | * / , -
      Day of week  | Yes        | 0-6 or SUN-SAT  | * / , - ?

      I'll also make a change to the API Scheduler tool to catch and throw errors for these types of configuration error - I'll post back here once done. Kind regards, Steve
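The kind of range check involved can be sketched as follows (a hypothetical validator for the numeric day-of-week field only - it ignores named days and step syntax, and is not the API Scheduler's actual code):

```python
# Hypothetical sketch of a day-of-week range check for a seconds-based
# cron expression, mirroring the 0-6 rule described above.
def day_of_week_valid(expr):
    """Return True if the 6th (day-of-week) field uses only digits 0-6."""
    field = expr.split()[5]  # e.g. "1-7" from "0 0 9 * * 1-7"
    if field in ("*", "?"):
        return True
    for part in field.split(","):
        for bound in part.split("-"):
            if bound.isdigit() and not 0 <= int(bound) <= 6:
                return False
    return True

print(day_of_week_valid("0 0 9 * * 1-7"))  # False: 7 is out of range
print(day_of_week_valid("0 0 9 * * 0-6"))  # True
```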
  15. Hi @samwoo @alecwa, I released a new tool last night that will run a report in Hornbill, read the CSV output, and insert or update the records into a local database table of your choice. Once the data is in there, you can point your Power BI reports at your database as their data source instead of using the R scripts. The new tool is documented on the Hornbill wiki, and the release & source code are on GitHub. It supports writing data to MySQL v4.1+, MariaDB (all versions) or SQL Server 2005+. Once you have a database table set up to hold your request data (I can provide you with a MySQL create table statement for the Service Manager requests table, if it'll help), you can use this tool to run reports in Hornbill to grab and populate the new table with all the historical data you need. Once you have the bulk of your data in, you can then schedule this tool to run incremental reports to insert new or update existing request records - use the h_datelogged and h_datelastmodified columns from the Requests table to identify new or updated records when building your incremental reports. Let me know if you need any pointers. Cheers, Steve
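The insert-or-update pattern described above can be sketched like this (a simplified illustration using an in-memory SQLite table; the real tool targets MySQL, MariaDB or SQL Server, and the schema here is cut down to the request reference and the two date columns mentioned above):

```python
import csv
import io
import sqlite3

# Simplified sketch of upserting report CSV rows into a local table,
# keyed on the request reference. The schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE requests (
    h_pk_reference TEXT PRIMARY KEY,
    h_datelogged TEXT,
    h_datelastmodified TEXT)""")

# Stand-in for the CSV output of a Hornbill report run.
report_csv = io.StringIO(
    "h_pk_reference,h_datelogged,h_datelastmodified\n"
    "IN00000123,2019-01-01 09:00:00,2019-01-02 10:30:00\n")

for row in csv.DictReader(report_csv):
    # INSERT OR REPLACE gives upsert semantics in SQLite; MySQL would use
    # ON DUPLICATE KEY UPDATE, and SQL Server a MERGE statement.
    conn.execute(
        "INSERT OR REPLACE INTO requests VALUES (?, ?, ?)",
        (row["h_pk_reference"], row["h_datelogged"],
         row["h_datelastmodified"]))

print(conn.execute("SELECT COUNT(*) FROM requests").fetchone()[0])  # 1
```

Re-running the loop with a newer h_datelastmodified for the same reference replaces the existing row rather than duplicating it, which is what makes scheduled incremental reports safe to repeat.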
  16. Hi @Izu, Yes, this works in the same way, by associating groups to the imported users. Please see the wiki page for more information: https://wiki.hornbill.com/index.php/LDAP_User_Import#Associating_a_Group_to_Hornbill_User_Accounts Cheers, Steve
  17. Hi @Izu, Are you allowing the scheduler to continue running in the background after it's been executed? It does need to be running at the time the Cron schedule is due. If so, and the API calls are still not being made, could you email me a copy of your current configuration file and the latest log file from the tool, and I'll see if there's anything we've missed. Thanks, Steve
  18. Hi @alecwa , I've had a look to see if providing incremental imports against an existing data set is feasible with Power BI, and it doesn't appear to be. The Power BI incremental import functionality is restricted to SQL Query Imports for Power BI Pro subscribers, and I'm not sure it's possible to script this in M as part of the data source refresh. One way around this would be for us to provide you with an open source tool that can execute reports in Hornbill and retrieve the output, before upserting the retrieved records into a database that you host & maintain. So you'd execute the tool against a catch-all report (one or more times depending on the size of your requests table) to populate your local database with all request data up until the point of execution, then have the tool run another report on a regular schedule to grab any requests that have been added or updated between executions. This would be a much more efficient approach than importing your ever-expanding request data in its entirety multiple times per day, and would mean you have a near-real-time copy of your request data hosted locally for Power BI to query as and when you need. If I remember correctly, this is how you performed your Power BI reporting against your Supportworks data prior to going live with Hornbill? Would this approach work for you? @samwoo Would this approach work for you also? Let me know your thoughts. Steve
  19. Hi @Joanne, I've had another look, and there was a problem mapping data from records when using certain database drivers, which was causing your issue. I've fixed the code in the tool and just released it to GitHub. The release package can be downloaded directly from here: https://github.com/hornbill/goDBAssetImport/releases/download/1.7.2/goDBAssetImport_v1_7_2.zip It's just the executable that you need to replace, as your configuration file shouldn't need to change. Let me know how you get on with this. Thanks, Steve
  20. Hi @Giuseppe Iannacone, To add the warranty start and expiry dates to your records, you need to map values from the query into the h_warranty_expires and h_warranty_start columns in the AssetGenericFieldMapping section of your configuration. These need to be populated by a date/time string, returned as part of your SQL query, in the format YYYY-MM-DD HH:MM:SS. If you take a look at the example conf_computerPeripheral.json file supplied with the Database Asset Import Tool release package, and the Database Asset Import documentation on our wiki, this should give you a good starter for 10 on how to import monitor assets into Hornbill from your SCCM database. I hope this helps, let me know if you need any more information. Steve
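For reference, that date/time layout looks like this when produced from a native datetime value (an illustrative Python sketch only - in practice the formatting would be done in your SQL query so the tool receives a ready-made string):

```python
from datetime import datetime

# Format a datetime into the YYYY-MM-DD HH:MM:SS layout expected for
# the h_warranty_start / h_warranty_expires mappings described above.
warranty_start = datetime(2019, 3, 1, 0, 0, 0)
print(warranty_start.strftime("%Y-%m-%d %H:%M:%S"))  # 2019-03-01 00:00:00
```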
  21. Hi @samwoo, I'm unable to replicate that issue. What version of the asset import tool were you using before this attempt? That error suggests you are on an older version of the tool, where the AssetTypes property in the JSON config was a flat object; in v1.5.0 and above the config layout has changed so that this is an array. Thanks, Steve
  22. Hi @samwoo, That's been fixed in v1.7.1 of the database asset import utility, which can be downloaded here. This was actually working as expected for new assets being imported, it was pre-existing assets that were not being updated correctly, which has been rectified. Let me know how you get on with it. Cheers, Steve
  23. Hi @Joanne, Apologies, I should have expanded "If you can send me your configuration file..." with "...in a private message on this forum". Either that, or you can use email: steve.goldthorpe@hornbill.com Kind regards, Steve
  24. Hi @Joanne, The "unsupported protocol scheme" error points to the tool not being able to connect to the Hornbill instance, probably due to one of the following: The (case-sensitive) instance ID defined in the configuration JSON is incorrect; Access to files.hornbill.com and files.hornbill.co is blocked from the PC that you're running the tool from. If you can send me your configuration file (with the API keys removed), I'll take a look at the config to make sure everything is ok there. Thanks, Steve
  25. Hi @Izu, Have you checked the scheduler tool log? Does that mention that the API call was successful, or failed? Thanks, Steve