
Steve G

Hornbill Developer

Posts posted by Steve G

  1. Hi @Lyonel,

    Looks like this may be an issue with the SQL driver library that I used in this tool. It appears to have an issue with SSO and hostname resolution in networks pushing IPv6 addresses. See this open issue for more details: https://github.com/denisenkom/go-mssqldb/issues/177

    Could you change your config so that the value of "SQL Conf > Server" uses the IP address of the SQL host instead of its hostname, and give that a whirl please?
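    For example, with a made-up address (swap in your own SQL host's IP), the change would just be this, leaving the rest of SQLConf as-is:

```json
{
    "SQLConf": {
        "Server": "10.20.30.40"
    }
}
```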

    Thanks,

    Steve

  2. Hi @Lyonel,

    Apologies for the late response, I'm currently on holiday so haven't been checking the forum regularly :D 

    Looks like you've used the config supplied by @Victor, in which authentication is set to SQL rather than Windows. So if you've not supplied SQL credentials, that is why the "database config not set" error is being returned... Could you change authentication back to Windows and let me know if that works? If not, I'll try to get logged in tomorrow to take a look.

    Thanks,

    Steve

  3. @clampj,

    The h_serial_number column is in the AssetsComputer entity, not the entity you have defined when searching for assets of type Desktop - which is why the API call is failing. You just need to change the Entity to AssetsComputer for the asset types where you want to search against the h_serial_number column.

    Hope this helps,

    Steve

  4. @clampj,

    On 8/15/2018 at 11:03 AM, clampj said:

    If possible can you advise any ETA on fixing this

    How about now :D 

    I had an epiphany last night and managed to fix this entirely in the tool, rather than having to make changes to Service Manager.

    The changes are documented on the Hornbill Wiki: https://wiki.hornbill.com/index.php/Database_Asset_Import

    As well as the Github page: https://github.com/hornbill/goDBAssetImport

    And the link for the new release (v1.5.0) download is here: https://github.com/hornbill/goDBAssetImport/releases/download/1.5.0/goDBAssetImport_v1_5_0.zip

    As a TL;DR, I've removed the AssetIdentifier and SQLConf>AssetID params from the tool config, and bundled these into an enhanced AssetTypes section:

    • AssetTypes is now an array of objects instead of a flat object;
    • Each object contains the following:
      • AssetType - the Asset Type Name which needs to match a correct Asset Type Name in your Hornbill Instance (the left-string from the AssetTypes section of the previous release)
      • Query - additional SQL filter to be appended to the Query from SQLConf, to retrieve assets of that asset type (the right-string from the AssetTypes section of the previous release)
      • AssetIdentifier - an object containing details to help in the identification of existing asset records in the Hornbill instance. If the value in an imported record's DBColumn matches the value in the EntityColumn of an asset in Hornbill (within the defined Entity), then the asset record will be updated rather than a new asset being created:
        • DBColumn - specifies the unique identifier column from the database query
        • Entity - the Hornbill entity where data is stored
        • EntityColumn - specifies the unique identifier column from the Hornbill entity

    So for each asset type, you can now define how the tool should look for matching records - the source column from the database query, and the entity (primary asset entity or related asset details entity - specific to the class of asset you're searching for) plus entity column. Note - if the DB column is NULL (so if you don't have a serial number in a returned record for instance), then a new asset will be created - so please make sure you cater for this in your query.
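    Putting the above together, a single AssetTypes entry might look something like this - the Query filter and DBColumn are placeholders for your own schema, while AssetsComputer and h_serial_number are from the Desktop example earlier in this thread:

```json
{
    "AssetTypes": [
        {
            "AssetType": "Desktop",
            "Query": "AND AssetType = 'Desktop'",
            "AssetIdentifier": {
                "DBColumn": "SerialNumber",
                "Entity": "AssetsComputer",
                "EntityColumn": "h_serial_number"
            }
        }
    ]
}
```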

    There are examples in the readme, and on the Wiki, but if you need any help just tag me in here and I'll take you through it.

    Hope this helps!

    Steve

  5. Hi @clampj @Victor
    I've just had a look at the code and this is a defect (well - an unfinished feature that's partially made its way in!). It's only searching against the h_name field in the Assets entity for existing records, which is why it never finds a matching asset in Hornbill and duplicates the record.

    It's not going to be a straightforward fix - it will definitely need changes to the Service Manager app as well as the import utility. There are complications due to the limited Asset fields currently defined as searchable, plus the fact that the serial number field doesn't live in the main asset entity - it's in an asset type-specific related entity. I'll have a think about the best way to handle this and will post back here when I have a solution.

    Thanks,

    Steve

  6. @Lyonel

    I'll update the Wiki to mention that, thanks. 

    Looking at the code (it's been a LONG time since I wrote that :) ), those fields are currently mandatory, so I'll make them optional in the next release when Windows is the authentication method. I've just stuck a letter "a" in my config for testing, and it does work OK.

    The connection issue is an odd one. What version of SQL Server are your asset records held on, and is it set to enforce encryption? If it's SQL Server 2008 R2 Service Pack 2 or above, can you try enabling encryption in the import tool SQLConf, and see if that helps? Encryption should only be set to false on SQL Server 2008 R2 or below, as they contained a defect with the handling of encrypted login records.

    Thanks,

    Steve

  7. Hi @Lyonel,

    I've just tested the latest version of the tool with Windows authentication to an MS SQL Server containing an asset database, and it works fine.

    What was the actual crash error message?

    One thing to note: the content of the username and password fields in the SQLConf section of the config is ignored if you choose Windows as your authentication method, as the tool actually authenticates using the account details of the Windows session it is run within. So the account that actually runs the tool (either your account if you're running it locally, or the system account used when scheduling the tool to run outside of an interactive session) needs access to your database. Just use empty strings for your values of Username and Password.
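    For instance (key names illustrative - check them against the example config shipped with the tool):

```json
{
    "SQLConf": {
        "Authentication": "Windows",
        "UserName": "",
        "Password": ""
    }
}
```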

    Thanks,

    Steve

  8. Hi @JamieMews,

    I can't replicate this issue (although I am seeing another issue with the latest version of Power BI Desktop, which I'll come to). A quick search of the Power BI Community pages reveals that this may be down to antivirus software causing permissions issues between Power BI and your R installation:

    https://community.powerbi.com/t5/Desktop/R-with-Power-BI/m-p/344077

    After updating my Power BI installation, I saw an issue where Power BI's security settings had been set to require approval for new native queries. This needed unticking before I could run any R script:

    image.png

    Let me know if this helps,

    Steve

  9. Hi @AndyHodkinsonPrincesIT,

    At the moment it is not possible to do first-character wildcard searches in the global search (you could search for S*62776 and this would return that request, but as you've raised, *62776 doesn't work). I've raised this with our development team who will look into the possibility of getting this added.

    In the meantime you can wildcard search in the request list Filter like so (as long as your selected views/filters allow the request to be seen):

    image.png

    I hope this helps,

    Steve

  10. Hi @Martyn Houghton,

    You can actually already do this using the timeline option in the global search. 

    If you wanted to search for the string email in change request CH00000123, you would enter the following into the query input:

    email AND reqref:CH00000123

    This would return all timeline entries from that specific request, that contain the word email.

    Kind regards,

    Steve

  11. Hi @Bob320,

    Aside from the CORS check issue, I noticed that you've mentioned you'll be making XMLMC calls using an API key for a generic user in the frontend JavaScript. This isn't very secure - anyone who has access to the web app (and therefore can see the API calls being made in the browser developer console) will be able to read the API key and use it to initiate any API calls that the generic user has rights to... I'd suggest you keep the API calls server-side, to protect the key from your web app users.

    Kind regards,

    Steve

  12. Hi @Dan Munns,

    Spotted the issue with your config. The input param names for all APIs in Hornbill are case sensitive, so serviceID should be serviceId, and catalogID should be catalogId.

    The following config worked for me, raising a Change Request every minute while it was running, with the summary, service and catalog item all correctly added:

    {
      "APIKey": "redacted",
      "InstanceID": "redacted",
      "Zone": "eur",
      "Schedule": [{
        "Enabled": true,
        "CronSchedule": "0 * * * * 1-5",
        "ScheduleFrom": "2018-07-03T13:00:00.000Z",
        "ScheduleTo": "2020-01-01T00:00:00.000Z",
        "Service": "apps/com.hornbill.servicemanager/ChangeRequests",
        "API": "logChangeRequest",
        "APIParams": {
          "0": {
            "Type": "Content",
            "Parameter": "summary",
            "Content": "Some summary text"
          },
          "1": {
            "Type": "Content",
            "Parameter": "description",
            "Content": "Some descriptive text"
          },
          "2": {
            "Type": "Content",
            "Parameter": "serviceId",
            "Content": "568"
          },
          "3": {
            "Type": "Content",
            "Parameter": "catalogId",
            "Content": "132"
          },
          "4": {
            "Type": "Content",
            "Parameter": "catalogName",
            "Content": "Change Request Config Item"
          }
        }
      }]
    }

    Hope this helps,

    Steve

  13. Hi @Martyn Houghton,

    Apologies for the ambiguous description of those settings - it's actually the HornbillITSM index that requires reindexing for the Knowledge Centre to function correctly, not the Knowledgebase index. Please select "All" when running this reindex, as each index document needs to include new columns, so a "New" reindex won't cut it. Also, please ensure that a Page Size of 10 is selected when you run this reindex.

    While I was looking at this issue, I also noticed that the reindex of your HornbillITSMTimeline index was cut short by a platform update starting during the job. So once the reindex of HornbillITSM is complete, please start another reindex of the HornbillITSMTimeline index, ensuring you select "New" (as you don't want to re-index everything that's already been done :) ), and keep a Page Size of 10. Once this is complete, all your request timeline posts and comments will be searchable in the Global Search :)

    Let me know if you have any issues with the above.

    Thanks,

    Steve

  14. Hi @MizeelA,

    This appears to be an issue with Power BI in the cloud not being able to authenticate against your locally installed Power BI Gateway, which is required when scheduling the running of reports that get data using R scripts. You need to check the configuration of your locally installed gateway and ensure that Power BI in the cloud can communicate and authenticate with this. See the following pages from the Power BI website for more information:

    https://powerbi.microsoft.com/en-us/blog/visualizing-and-operationalizing-r-data-in-power-bi/

    https://docs.microsoft.com/en-us/power-bi/service-gateway-personal-mode

    Kind regards,

    Steve

  15. Hi @Dan Munns,

    Sorry, I thought I'd already replied to this. The asset importer will update any asset it finds where the value of the source field defined by the AssetID object in the config matches a value in Hornbill, in the table column defined by the AssetIdentifier object in the config. So where the value returned by Asset.Name here:

    image.png

    Matches a value in the main Hornbill assets table column h_name, as defined here:

    image.png

    Which, if the AssetGenericFieldMapping is configured like so, should find matches on the second+ run of the import tool:

    image.png

    So as long as the tool finds a match, and the operational or record state of the record (or indeed any of the other mapped columns) contains a different value, then the record will be updated. If no match is found, then a new asset record is created.

    Kind regards,

    Steve

  16. Hi @Dan Munns,

    Apologies for the late response, I've been on leave :) 

    Great to see that you're using the tool and it's working for you!

    On 26/03/2018 at 3:57 PM, Dan Munns said:

    Edit: Whilst I remember, is there a way I can set up another task to set Quarantined machines (Snow) to Archived with the tool? (maybe not quite as 'complete' as I said :))

    Yes, this can be done in the tool by setting the value of one (or both) of the following fields in the AssetGenericFieldMapping section of the config JSON. Note that this means you will need one JSON config file per record state being imported, as the values will need to be hard-coded in the JSON:

    h_record_state : this is the State field from the asset record, and will take the following integer values:

    • 0 - Current
    • 1 - Active
    • 2 - Archived

    h_operational_state : this is the Operational State field from the asset record, and will take the following integer values:

    • 0 - Operational
    • 1 - Pre-Production
    • 2 - Retired

    So if you include a state clause in your asset type filters to only return assets with a specific status (Status eq 'Active', for example) to match the hard-coded values set in h_record_state and/or h_operational_state, then you should see the correct statuses against the imported assets.

    In this example, all assets imported/updated by the tool using this config file would have a state of Archived and an operational state of Retired.

    image.png
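    Expressed directly in the config, those hard-coded mappings would sit alongside your other mapped fields along these lines (other mappings omitted):

```json
{
    "AssetGenericFieldMapping": {
        "h_record_state": "2",
        "h_operational_state": "2"
    }
}
```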

    I'll add support for record & operational status mapping into the tool when I get a chance, so you can roll these up into the same config. I'll let you know once that's available.

    Kind regards,

    Steve

  17. Hi @Oscar Stankard,

    We are in the process of making a number of enhancements to how the request search works, but in the meantime did you know about the Ctrl+Shift+F shortcut? Hitting this key combination from anywhere within Hornbill will present you with a Quick Search box, allowing you to quickly get to a Request that you know the reference for.

    Just type or paste in a Request ID, and click Open Request to be taken straight to it :)

    Kind regards,

    Steve

    image.png

  18. Hi @samwoo,

    I've had a look at the CSV asset importer, and defined the h_location_type column to be populated using the locationType mapped column from a dummy CSV file, and the location type is populated correctly.

    So from the config file:

    image.png

    And the CSV & resulting asset once the import has run:

    image.png

     

    I've also tried hard-coding the h_location_type value into the config JSON, and again the field is populated as expected:

    image.png

    So I'm not sure why your assets are not populated with the location type, I'm afraid. If you want to PM me a snapshot of your CSV and a copy of your import config, I'll take a look and see if I can spot what's different for you.
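    For comparison, the mapping in my dummy test was along these lines - the CSV column names are just from my dummy file, and I'm assuming the tool's square-bracket placeholder syntax for mapped columns here:

```json
{
    "AssetGenericFieldMapping": {
        "h_name": "[Name]",
        "h_location_type": "[locationType]"
    }
}
```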

    Kind regards,

    Steve
