Steve G

Hornbill Developer

Everything posted by Steve G

  1. Hi @samwoo, It's a function that natively parses and retrieves data from JSON objects stored in a database column - h_buz_activities.h_extra contains some JSON data, so it's just a way of getting to its properties via SQL. Documentation for the function can be found in the MariaDB knowledgebase: https://mariadb.com/kb/en/library/json_value/ Cheers, Steve
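     For illustration, a query along these lines pulls a single property out of that JSON column (the '$.someProperty' path is just a placeholder for whichever property you're interested in):

         SELECT JSON_VALUE(h_extra, '$.someProperty') AS some_property
         FROM   h_buz_activities
         LIMIT  10;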
  2. Hi @Martyn Houghton, The primary source IP is 87.117.243.10. DR currently has non-assigned IPs, but we're in the process of changing this so we'll have a fixed IP for the DR data centre, which will be in the next month or so. I'll reply to this post again once I have that IP. Cheers, Steve
  3. Hi @chriscorcoran, What you need is actually possible with reporting, using JOINs and some funky custom criteria to define the JOIN clauses. I've built and attached a basic report that will return a list of requests, and against each request returned it will output the first and latest timeline updates for emails received: requests-with-email-updates.report.txt. The custom JOINs use sub-queries to retrieve the unique IDs for request timeline posts that had an email as their source, where the email was sent to a specific email address (you need to replace yourservicedesk@email.com with the relevant email address against BOTH the actA and actB joins). Note that I've also added a filter to the report, so that it will only return requests that have had a timeline update via email. Feel free to add other filters and/or columns to the report to retrieve the data you need. Let me know how you get on with this. Cheers, Steve
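     As a very rough sketch of the pattern the report relies on (this is not the SQL from the attached report - the requests table and the request_ref/source/recipient column names below are placeholders rather than the real schema), each custom JOIN boils down to a correlated sub-query that picks one timeline post ID per request; the actB join would use MAX rather than MIN to get the latest update:

         -- Illustrative only: table and column names here are placeholders, not the real Hornbill schema.
         SELECT r.reference, actA.h_extra
         FROM   requests r
         JOIN   h_buz_activities actA
           ON   actA.id = (SELECT MIN(a.id)
                           FROM   h_buz_activities a
                           WHERE  a.request_ref = r.reference
                             AND  a.source      = 'Email'
                             AND  a.recipient   = 'yourservicedesk@email.com');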
  4. Hi @Martyn Houghton, This is set using the h_updatetype property of the HistoricUpdateMapping section within the import config. This is an integer column, and the code looks for particular bits being set against the integer, using bitwise operators, to work out the update type and visibility for each row. Essentially, setting bit 1 marks the historic update type as User, and 512 marks a historic update as private, so setting the integer to 513 will give a private user update. This old post comment from the Supportworks forum describes the values in more detail. Kind regards, Steve
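     As a quick worked example of how the bits combine (a trimmed sketch only - whether you hard-code the value or map it from a column in your source data depends on your config):

         "HistoricUpdateMapping": {
             "h_updatetype": 513
         }

     Here 513 = 1 (user update) + 512 (private), so every mapped row comes through as a private user update; 1 on its own would give a public user update.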
  5. Hi @Martyn Houghton, The logging of iBridge method invocations and errors to the ESPServerService log has been added to the platform and will be available in the next platform build (>2975). Cheers, Steve
  6. Hi @Paul Alexander, Just on the back of the reply from James, the last run date and status have both been added, and will be available when the next update hits your instance. Kind regards, Steve
  7. Hi @Izu, I've added a debug mode to the API Scheduler tool, which when enabled will output the API call request and response payloads to the log file. This should help in working out what's going on with the API calls that are being made, or in identifying the new records being created. This release (v1.2.0) can be downloaded from here: https://github.com/hornbill/goAPIScheduler/releases/download/1.2.0/goAPIScheduler_v1_2_0.zip No change to your configuration is required; you just need to replace the executable with the relevant one from the release, then add the -debug=true flag to the command when running the executable. Let me know how you get on. Cheers, Steve
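     For example, on Windows the command would look something like this (the executable name is illustrative - use whichever binary from the release matches your platform, and keep any other flags you already pass):

         goAPIScheduler.exe -debug=true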
  8. Hi @Izu, One way would be to create a custom view on your request list, filtered by the Status, Service and Catalog Item that you have defined in the API Scheduler configuration. You should then have those requests listed. I'll have a look at the tool and see if we can add the API response body to the log for a successful API call. It won't be particularly elegant, but you'll then be able to see in the log any params returned by each API call (including the Request Reference from your call to logServiceRequest). I'll reply to this thread once it's done. Thanks, Steve
  9. Hi @RobD, You can use basic wiki markup in the authorisation details, but you need to change the modifier in the email template to wiki. Currently, only basic formatting (Bold, Italic, Ordered & Unordered lists) is supported. Please see the Variables > Modifiers section of the email templates wiki page for more information: https://wiki.hornbill.com/index.php/Email_Templates Cheers, Steve
  10. No worries @Joshua T M, glad to hear that's up and running! This is the API you need to retrieve attachment details: https://api.hornbill.com/apps/com.hornbill.servicemanager/Requests?op=smGetAttachments Cheers, Steve
  11. Hi @Izu, Once you'd fixed the cron expression, did you restart the executable? The config is only loaded when the executable first starts, so the executable will need to be killed and restarted to pick up the changes. Once a schedule defined in your expression is reached, you will see a message in the console, and in the log, stating whether or not the API call was successful. Thanks, Steve
  12. Hi @Giuseppe Iannacone, Ah ok, I see what you mean now. There's no setting for this currently, but I'll feed this enhancement request into our development team. Cheers, Steve
  13. Hi @Giuseppe Iannacone, The example that @James Ainsworth has provided above will prevent duplicates based on a matching serial number - if an identical serial number is found, then the asset details will be updated if any of the fields do not match their existing values. Are you looking for functionality in the tool to not allow updates if a match is found? Or for functionality in the UI to prevent analysts from creating assets with identical serial numbers? Thanks, Steve
  14. Hi @Joshua T M, Apologies, it would have helped if I'd pasted the link: https://api.hornbill.com/apps/com.hornbill.servicemanager/Requests?op=getActivityStreamAccessToken Cheers, Steve
  15. Hi @samwoo, There's already a tool that runs Hornbill reports and dumps the files to a network share or folder of your choice. It's up on GitHub, and documented on the wiki. Cheers, Steve
  16. Hi @Joshua T M, This is possibly down to access rights, as no activity stream access token is being passed to the API call. In your script, if you call the following API to generate an access token, then pass that token into the accessToken parameter of your call to activityStreamQuery, you should then see the activities returned. Cheers, Steve
  17. Hi @Izu, The issue is with the CronSchedule expression defined in your configuration: "CronSchedule":"0 0 9 * * 1-7", You have 1-7 defined as the days of the week on which to make the API call - this is incorrect and is causing the scheduler to fail. It should be 0-6 to define every day of the week (with 0 being Sunday and 6 being Saturday), or you could replace it with a *. Please see the following extract from the Cron package documentation for more information:

     Field name   | Mandatory? | Allowed values  | Allowed special characters
     ------------ | ---------- | --------------- | --------------------------
     Seconds      | Yes        | 0-59            | * / , -
     Minutes      | Yes        | 0-59            | * / , -
     Hours        | Yes        | 0-23            | * / , -
     Day of month | Yes        | 1-31            | * / , - ?
     Month        | Yes        | 1-12 or JAN-DEC | * / , -
     Day of week  | Yes        | 0-6 or SUN-SAT  | * / , - ?

     I'll also make a change to the API Scheduler tool to catch and throw errors for this type of configuration error; I'll post back here once that's done. Kind regards, Steve
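     So the corrected line in your configuration would be one of these (both mean 09:00:00 every day of the week):

         "CronSchedule":"0 0 9 * * 0-6",
         "CronSchedule":"0 0 9 * * *",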
  18. Hi @samwoo @alecwa, I released a new tool last night that will run a report in Hornbill, read the CSV output and insert or update the records into a local database table of your choice. Once the data is in there, you can then point your Power BI reports at your database as their data source instead of using the R scripts. The new tool is documented on the Hornbill wiki, and the release & source code are on GitHub. It supports writing data to MySQL Server v4.1+, MariaDB (all versions) or SQL Server 2005+. Once you have a database table set up to hold your request data (I can provide you with a MySQL create table statement for the Service Manager requests table, if it'll help), you can then use this tool to run reports in Hornbill to grab and populate the new table with all the historical data you need. Once you have the bulk of your data in, you can then schedule this tool to run incremental reports to insert new or update existing request records - use the h_datelogged and h_datelastmodified columns from the Requests table to identify new or updated records when building your incremental reports. Let me know if you need any pointers. Cheers, Steve
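     To give you an idea of what the local table could look like, here's a cut-down placeholder in MySQL/MariaDB syntax (not the full create statement I mentioned - the reference/summary/status columns are illustrative, so adjust the names and add columns to match whatever your report outputs):

         -- Placeholder table only; column names mirror typical Hornbill request columns.
         CREATE TABLE requests (
             h_pk_reference     VARCHAR(30) NOT NULL PRIMARY KEY,
             h_summary          VARCHAR(255),
             h_status           VARCHAR(30),
             h_datelogged       DATETIME,
             h_datelastmodified DATETIME
         );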
  19. Hi @Izu, Yes, this works in the same way, by associating groups to the imported users. Please see the wiki page for more information: https://wiki.hornbill.com/index.php/LDAP_User_Import#Associating_a_Group_to_Hornbill_User_Accounts Cheers, Steve
  20. Hi @Izu, Are you allowing the scheduler to continue running in the background after it's been executed? It does need to be running at the time the cron schedule is due. If so, and the API calls are still not being made, could you email me a copy of your current configuration file & the latest log file from the tool, and I'll see if there's anything we've missed. Thanks, Steve
  21. Hi @alecwa , I've had a look to see if providing incremental imports against an existing data set is feasible with Power BI, and it doesn't appear to be. The Power BI incremental import functionality is restricted to SQL Query Imports for Power BI Pro subscribers, and I'm not sure it's possible to script this in M as part of the data source refresh. One way around this would be for us to provide you with an open source tool that can execute reports in Hornbill and retrieve the output, before upserting the retrieved records into a database that you host & maintain. So you'd execute the tool against a catch-all report (one or more times depending on the size of your requests table) to populate your local database with all request data up until the point of execution, then have the tool run another report on a regular schedule to grab any requests that have been added or updated between executions. This would be a much more efficient approach than importing your ever-expanding request data in its entirety multiple times per day, and would mean you have a near-real-time copy of your request data hosted locally for Power BI to query as and when you need. If I remember correctly, this is how you performed your Power BI reporting against your Supportworks data prior to going live with Hornbill? Would this approach work for you? @samwoo Would this approach work for you also? Let me know your thoughts. Steve
  22. Hi @Joanne, I've had another look, and there was an issue with mapping data from records when using certain database drivers, which is what was causing the problem you saw. I've fixed the code in the tool and just released it to GitHub. The release package can be downloaded directly from here: https://github.com/hornbill/goDBAssetImport/releases/download/1.7.2/goDBAssetImport_v1_7_2.zip It's just the executable that you need to replace, as your configuration file shouldn't need to change. Let me know how you get on with this. Thanks, Steve
  23. Hi @Giuseppe Iannacone, To add the warranty start and expiry dates to your records, you need to map values from the query into the h_warranty_expires and h_warranty_start columns in the AssetGenericFieldMapping section of your configuration. These need to be populated by a date/time string, returned as part of your SQL query, in the format YYYY-MM-DD HH:MM:SS. If you take a look at the example conf_computerPeripheral.json file supplied with the Database Asset Import Tool release package, and the Database Asset Import documentation on our wiki, this should give you a good starter for 10 on how to import monitor assets into Hornbill from your SCCM database. I hope this helps, let me know if you need any more information. Steve
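     For illustration, the relevant part of the mapping would look something like this (assuming the same [ColumnName] substitution style used by the other mappings in the example config; [WarrantyStartDate] and [WarrantyEndDate] are placeholders for whatever your SQL query returns those date/time strings as):

         "AssetGenericFieldMapping": {
             "h_warranty_start": "[WarrantyStartDate]",
             "h_warranty_expires": "[WarrantyEndDate]"
         }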
  24. Hi @samwoo, I'm unable to replicate that issue. What version of the asset import tool were you using before this attempt? That error suggests you are on an older version of the tool, where the AssetTypes property in the JSON config was a flat object, whereas in v1.5.0 and above the config layout has changed so that this is an array. Thanks, Steve
  25. Hi @samwoo, That's been fixed in v1.7.1 of the database asset import utility, which can be downloaded here. This was actually working as expected for new assets being imported; it was pre-existing assets that were not being updated correctly, and that has now been rectified. Let me know how you get on with it. Cheers, Steve