Showing results for tags 'sql'.

Found 14 results

  1. In order to undertake more complex measures, and to be able to retrospectively regenerate them, can we request an enhancement to allow the 'Date Ranging Column' start and end date/time stamps to be inserted into the 'Query where clause'? For example, we want to set a measure that obtains the number of open requests at the end of a period, so we would count requests where the Date Logged is before the end of the period and the date closed is null or after the end of the period. Cheers Martyn
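A sketch of what such a measure query might look like, assuming a hypothetical `{periodEndDateTime}` placeholder (this placeholder does not exist today; it is the enhancement being requested) and the `h_itsm_requests` table referenced elsewhere in these posts:

```sql
-- {periodEndDateTime} is the hypothetical placeholder being requested;
-- it would be substituted with the end of each measure period at sample time.
SELECT COUNT(*)
FROM h_itsm_requests
WHERE h_datelogged <= '{periodEndDateTime}'
  AND (h_dateclosed IS NULL OR h_dateclosed > '{periodEndDateTime}');
```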
  2. I am trying to work out whether it is possible, in either Analytics Measures or Widgets, to get a count of distinct external organisations which have logged a request in the last 12 months. I can get a count of requests logged in the last 12 months with the following:- h_org_id<>0 and h_datelogged>(DATE_SUB(now(), INTERVAL 12 MONTH)) But as I am not able to write the SQL and insert distinct(h_org_id), or group by on a count, I am struggling to work out how to get this displayed as either a measure or a widget. Cheers Martyn
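For reference, if raw SQL were allowed, the distinct-organisation count would only be a small change to the existing criteria, using MySQL's `COUNT(DISTINCT ...)`. The table name `h_itsm_requests` is assumed from the other posts in this thread:

```sql
-- Count each external organisation once, however many requests it logged
SELECT COUNT(DISTINCT h_org_id)
FROM h_itsm_requests
WHERE h_org_id <> 0
  AND h_datelogged > DATE_SUB(NOW(), INTERVAL 12 MONTH);
```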
  3. Excuse my limited SQL knowledge. I am trying to run a report in the Reports module against outstanding requests older than X days. The preset variables do not allow for this specific detail, so I would like to use the "custom criteria" option, but simply pasting in some SQL doesn't seem to be working for me. Is anyone able to help? I am using this SQL: h_datelogged < DATE_SUB(CURDATE(), INTERVAL 30 DAY) configured as per the attached. The report is still returning requests logged less than 30 days ago in this example.
  4. Can I request an enhancement to allow the restriction introduced in ESP Build 3237 - "Enforced checks for Database Direct to only allow SELECTs for production instances" - to be made configurable by a permission or system setting? We often use additional SQL commands in the Admin Tool Database Direct to correct or update records, especially when undertaking data migrations into our Hornbill instance. Cheers Martyn
  5. We are experiencing an issue when trying to set up and run the SQL User Import to upload new users from LDAP on a regular basis. We are getting the attached error. We've looked on the forum and there was one instance of this, but it appeared to have been resolved over 12 months ago with a change to the routine on the Hornbill side. Can someone from the support team or the integration experts please take a look? Thanks Neil
  6. Hi there, We use Service Manager to manage our main support inbox, but for a period the SM version of the system was not being used (mail was still coming in) and there is now a large number of messages, between 6 and 9 months old, that remain unread. At the moment we're incrementally going through and marking them as read manually as and when we can spare the time, but for some reason you're no longer able to tick the top email, Ctrl-click the bottom one, and then mass update them (mark as read, in our case). It seems that if you scroll too far your original selection is lost, so this process can only be done for about a hundred messages at a time (I'm told this used to work differently and was formerly possible, but I don't know that for myself). Please can you tell me whether there is any way to run SQL on the back end to programmatically update the message statuses, or whether anyone knows of a better way to do this via the front-end UI? The only place this is an issue is the various badges that appear on email icons indicating "99+" unread emails; we'd like those to return to the actual number of emails we've not yet read so we get some value from them. We currently have about 30k emails to mark as read, and this would take someone about a day to do manually using our current best method. Any thoughts appreciated! Thanks in advance. Kind regards, Oscar
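If a back-end update were possible, it would presumably be a single UPDATE statement along these lines. To be clear, the table and column names below are illustrative guesses only, NOT confirmed Hornbill schema, and given the production-instance SELECT-only restriction mentioned elsewhere in this thread, this would need to go through Hornbill support anyway:

```sql
-- WARNING: h_msg_messages, h_msg_read and h_msg_date are assumed names
-- for illustration; confirm the real mailbox schema with Hornbill first.
UPDATE h_msg_messages
SET h_msg_read = 1
WHERE h_msg_read = 0
  AND h_msg_date < DATE_SUB(NOW(), INTERVAL 6 MONTH);
```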
  7. Would it be possible to have a type of Simple List source, for use in progressive capture questions and custom fields, which uses a SQL query to generate the available options? The reason for asking is that we are looking to log internal project management requests in the system, but they will relate to an external customer (contact) organisation. So rather than having to maintain a list of organisations in a simple list, I would, in this case, use a SQL query to obtain the list of organisations from the existing internal table where the industry type is a certain value. Cheers Martyn
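A sketch of the kind of query such a SQL-driven simple list might run. The table and column names here are assumptions for illustration (the real organisations table in the Hornbill schema may differ), as are the value/display aliases:

```sql
-- h_sys_organizations, h_industry, and the value/display aliases are
-- illustrative assumptions, not confirmed Hornbill schema.
SELECT h_organization_id AS value,
       h_organization_name AS display
FROM h_sys_organizations
WHERE h_industry = 'Project Management'
ORDER BY h_organization_name;
```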
  8. Hi, When exporting a CSV from a SQL query in Database Direct there aren't any text qualifiers, as there are when exporting a CSV from the requests list. This causes problems when there are blank fields, and where there are commas in the text fields, as it puts the items in the wrong columns. Would it be possible to change that, or give the option to use text qualifiers? Thanks, Pete
  9. Hi, I'm trying to do a SQL query in Database Direct with a join, but I'm having a bit of a problem. I want to return all data in h_itsm_requests that matches records in h_itsm_questions where h_question_id = 'citrixserver', but I'm not getting any data. It's either an issue with the joining, or because I have criteria on both tables. This next one returns 52 records but there should be 134; it's only returning records where the criteria on both tables are matched. I've tried a left join and a right join, but it hasn't made a difference. I need to return everything that matches the following query, and return h_answer from h_itsm_questions where h_question_id = 'citrixserver'. Can anyone help? This level of SQL is a bit beyond me.
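The usual cause of this symptom is filtering the left-joined table in the WHERE clause, which silently turns the LEFT JOIN back into an inner join. Moving that filter into the ON clause keeps the outer join intact. A sketch, with the join key and request-side criteria as assumptions to be adjusted:

```sql
-- Keeping the h_itsm_questions filter in the ON clause (not WHERE) preserves
-- the outer join: requests with no matching answer are still returned,
-- with h_answer as NULL.
SELECT r.*, q.h_answer
FROM h_itsm_requests AS r
LEFT JOIN h_itsm_questions AS q
       ON q.h_entity_id = r.h_pk_reference   -- join key assumed; adjust to your schema
      AND q.h_question_id = 'citrixserver'
WHERE r.h_status = 'status.open';             -- whatever your request-side criteria are
```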
  10. At the moment you have to use a 'Measure' as the source of the data for the Target Counter Widget in Advanced Analytics. Though you can set the widget to refresh as often as every minute, the measure you have to use as the source can only be updated as frequently as once a day. It would be useful to be able to use a custom SQL query with the Target Counter Widget, allowing it to be used more in real time. The example we wanted to use was for a 'System Health' dashboard which would pick up, in near real time, requests which are unassigned, requests at New status and not picked up within x minutes, etc. Cheers Martyn
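A sketch of the kind of custom counter query this would enable. The `h_ownerid` column name and the `status.new` value are assumptions based on conventions seen elsewhere in these posts:

```sql
-- Near-real-time 'System Health' counter: requests still at New status,
-- unassigned, and not picked up within 15 minutes of being logged.
-- h_ownerid is an assumed column name; verify against your schema.
SELECT COUNT(*)
FROM h_itsm_requests
WHERE h_status = 'status.new'
  AND h_ownerid IS NULL
  AND h_datelogged < DATE_SUB(NOW(), INTERVAL 15 MINUTE);
```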
  11. Hi all, Not sure where to place this; apologies if this is the wrong (sub)forum. I am noticing some inconsistencies with comma placement in the .CSVs exported from Database Direct. When running this simple query: SELECT * FROM h_cmdb_assets as a LEFT JOIN h_cmdb_assets_computer_peripheral as b on a.h_pk_asset_id = b.h_pk_asset_id WHERE a.h_name LIKE '%Jabra%' order by a.h_pk_asset_id ASC limit 250 I get the following data returned: As you can see, the retired_date column is only filled in on the first record. When exporting this through the CSV button I get the following .CSV (opened in Notepad++ before even touching it with Excel; I took the liberty of "tabbing" the plain text for readability): As you can see, the exported file seems to be missing a few commas. This happens throughout this document and other documents exported this way. Does anyone else experience the same problem? Thanks, Alex
  12. Hi guys, I have a bit of a random but genuine question: why is Hornbill running on top of a MySQL database? The reason for my question is simply that I have been using MS SQL Server for years now and I find it very reliable, with good performance, low maintenance, and very good handling of high volumes of transactions and data. MySQL never really struck me as the best database engine out there. Also, I keep reading on the forums about wanting to schedule reports and export them to formats other than CSV or HTML; this is very easily achieved with SSRS. I am loving the architecture and technology used for Hornbill - it is really great stuff you guys are doing - but the database side still eludes me a bit, I must admit.
  13. We imported all our old data from SupportWorks into Service Manager, but the way we were using SupportWorks we didn't often resolve calls; we used to just close them instead. We've now changed the way we're handling calls: we set them to resolved, and then either let the customer close them or let them auto-close after x days. I need to report on the resolved date for the calls, which is fine for anything we've closed within Service Manager, but the h_dateresolved field is blank for thousands of records. Would it cause any issues to run an update query to set h_dateresolved to the data in h_dateclosed for closed and resolved calls where h_dateresolved is blank? Is this the SQL I would need to use? UPDATE h_itsm_requests SET h_itsm_requests.h_dateresolved = h_itsm_requests.h_dateclosed WHERE h_status = 'status.closed' AND h_dateresolved IS NULL
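Before running any bulk UPDATE like the one above, it is worth previewing exactly which rows it would touch with a read-only query using the same criteria (plus a check that h_dateclosed actually has a value to copy). A sketch:

```sql
-- Read-only preview: how many closed records would the backfill touch?
-- The extra h_dateclosed IS NOT NULL guard avoids copying a blank value.
SELECT COUNT(*)
FROM h_itsm_requests
WHERE h_status = 'status.closed'
  AND h_dateresolved IS NULL
  AND h_dateclosed IS NOT NULL;
```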
  14. At the moment we have a number of saved SQL queries in Database Direct, which is very useful for running common queries. However, one issue is that in order to view the SQL you have to run it, as it runs as soon as you select it. This is fine when only doing SELECTs, but as we have a number of SQL update queries we run in relation to imported data, it would be good to have the option to open or edit the saved SQL without having to run it. Cheers Martyn