alecwa

Maximum Report Cell Count


Hello,

It's been a real pain since we implemented Hornbill that the maximum cell count on report exports is 25,000. Our Service Manager had to spend two days appending reports to set up our Power BI dashboard for 2019, which she really shouldn't have had to do.

Can you let us know when this will be increased or ideally set to unlimited?

Alternatively... Will we be getting a direct API connection for Power BI or anything similar which would get us around this problem?

Thanks,

Alec

2019-01-04 16_23_53-Administration _ Advanced.png


@alecwa

Can I understand more about what you are trying to achieve? It sounds to me like you want to do an export rather than run a report. If so, reporting is not really the right route for this.

We set a maximum of 25,000 rows on a report because we thought that was a reasonable upper limit. We also place this limit because we did not expect reporting to be used for unlimited-size data exports; reporting is simply not designed for that purpose.

If I can understand better what you are trying to achieve, we can probably give you guidance on the correct approach.

Gerry

 


Hi @Gerry thanks for your reply.

We need to be able to extract the data so that we can use Power BI for reporting. Hornbill doesn't have the functionality to export data automatically, so we have to use reports in order to refresh our data automatically. We use Power BI because we integrate data from other areas to provide a full reporting pack back to the business, so relying solely on Hornbill's reports doesn't work for us.

Using R Script isn't ideal and we would love to be able to refresh via an API, but we've been told that isn't possible. We need to refresh our data roughly every hour, up to 8 times per day, to keep business dashboards accurate.

Hope this helps! :-)

Alec


@alecwa

Thanks for the clarification; I understand what you are trying to achieve now. So can I ask: what data are you exporting? Are you trying to export all data 8 times a day? I am trying to establish why you are hitting the 25k record limit.

Gerry


We're exporting the following:

- Request ID
- Logged Date
- Logged By
- Category
- Request Type
- Closed Date
- Closed By
- Description
- Summary
- External Ref No
- RPN
- Customer Code
- Company Name
- Closure Category
- Priority
- Source
- Customer Department
- Services

We may also include other fields from time to time.

We have had to set up reports containing just the request ID and one other field, because we hit the 25,000-cell limit so early on. We generate approx. 600 tickets per week.


@alecwa

OK, in terms of records, what are you exporting? Are you just exporting all requests, open and closed? In other words, is that export growing by approx. 600 records per week?

Gerry 


Yes, that's correct - we need the historical data to build new and modified Power BI reports. So we're exporting everything, for all time: open, closed, on hold, the lot.


@alecwa

So that's really the problem we face: there is no viable upper limit on the number of records that would work in your context. The reporting engine, convenient as it is for what you are doing with it, is limited; it is not capable of exporting an ever-growing, potentially unlimited amount of data every hour or so. What we need to work out is how to get what you need exported on an incremental basis, i.e. we export today's data/changes and your back end incrementally extends your local data set. As of today we do not have a tool that can do this, and sadly just expanding the row limit is not technically viable, as it is also our responsibility to deliver a good quality service with performance and uptime commitments.

I think I will need to defer to my devs here to see if we can come up with a possible solution. I am pretty sure, though, that nothing we do in terms of development will be delivered very quickly; what you need is quite complicated to build if it is to scale without killing the performance of our production databases.

Gerry

 


Hi Gerry. Yes, I understand the current reporting engine is limited, and this is where the frustration comes from. We really need something like an API, or some form of Power BI plug-in, which would enable us to query live data.

We've found this current method is the only way to report to the business in near real time, but as we said, it's a pain and very time-consuming to maintain.


@alecwa

One of our integration engineers will post back and offer some alternative guidance. I would like to point out, though, that regardless of the system you are using, if your strategy is to pull all data from an ever-growing dataset, there will come a time when performance and reliability become an issue. It is perfectly feasible to achieve near-real-time reporting this way, but as I say, you need a better approach: pull data incrementally rather than pulling an ever-growing dataset. We place the limitation on the reporting engine precisely to catch this sort of use case; it may be an accessible workaround, but it is the wrong approach if you want something robust in the long term. We will come up with an alternative approach for you, though it will probably involve some form of integration/data migration connector or tool.

Thanks

Gerry


Just wanted to add that we would also be interested in a way to export the data, maybe daily, covering everything created or updated the previous day, to be used in Power BI dashboards. As Alec has said, we also need an expanding dataset, but it doesn't have to be real-time for us and can start from a date in the future (if that's possible).

Thanks,

Samuel


Hi @alecwa ,

I've had a look to see if providing incremental imports against an existing data set is feasible with Power BI, and it doesn't appear to be. The Power BI incremental import functionality is restricted to SQL Query Imports for Power BI Pro subscribers, and I'm not sure it's possible to script this in M as part of the data source refresh.

One way around this would be for us to provide you with an open source tool that can execute reports in Hornbill and retrieve the output, before upserting the retrieved records into a database that you host & maintain. So you'd execute the tool against a catch-all report (one or more times depending on the size of your requests table) to populate your local database with all request data up until the point of execution, then have the tool run another report on a regular schedule to grab any requests that have been added or updated between executions. This would be a much more efficient approach than importing your ever-expanding request data in its entirety multiple times per day, and would mean you have a near-real-time copy of your request data hosted locally for Power BI to query as and when you need. If I remember correctly, this is how you performed your Power BI reporting against your Supportworks data prior to going live with Hornbill?
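To make the idea concrete, the upsert step described above can be sketched as follows. This is a minimal illustration only: the CSV column names are hypothetical, and an in-memory SQLite database stands in for the locally hosted MySQL/SQL Server instance the tool would actually target.

```python
import csv
import sqlite3
from io import StringIO

# Hypothetical report output; real Hornbill report columns will differ.
report_csv = """request_id,summary,date_last_modified
IN001,Printer jam,2019-01-03 09:00:00
IN002,VPN issue,2019-01-04 11:30:00
"""

# In-memory SQLite stands in for the locally hosted database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE requests ("
    "request_id TEXT PRIMARY KEY, summary TEXT, date_last_modified TEXT)"
)

# Upsert each report row: new request IDs are inserted, existing ones are
# replaced, so re-running the same report never duplicates records.
for row in csv.DictReader(StringIO(report_csv)):
    conn.execute(
        "INSERT OR REPLACE INTO requests VALUES "
        "(:request_id, :summary, :date_last_modified)",
        row,
    )
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM requests").fetchone()[0]
print(count)  # 2
```

Because the request ID is the primary key, running the same (or an overlapping) report repeatedly converges on one row per request, which is what makes scheduled incremental runs safe.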
    
Would this approach work for you? 

@samwoo Would this approach work for you also?

Let me know your thoughts.

Steve


Hi @Steve G,

This does sound feasible, as it gives us control on how we import the data too.

Thanks,

Samuel


Hi @samwoo @alecwa,

I released a new tool last night that will run a report in Hornbill, read the CSV output and insert or update the records into a local database table of your choice. Once the data is in there, you can then point your Power BI reports to your database as its data source instead of using the R scripts.

The new tool is documented on the Hornbill wiki, and the release & source code are on GitHub. It supports writing data to MySQL v4.1+, MariaDB (all versions) or SQL Server 2005+.

Once you have a database table set up to hold your request data (I can provide you with a MySQL create table statement for the Service Manager requests table, if it'll help), you can then use this tool to run reports on Hornbill to grab and populate the new table with all the historical data you need. Once you have the bulk of your data in, you can then schedule this tool to run incremental reports to insert new or update existing request records - use the h_datelogged and h_datelastmodified columns from the Requests table to identify new or updated records when building your incremental reports.
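The incremental step boils down to keeping a watermark of the last successful run and only picking up records modified after it. A rough Python sketch of that filtering logic (the record layout and the h_pk_reference column name are illustrative assumptions, not the tool's actual behaviour; h_datelastmodified mirrors the column mentioned above):

```python
from datetime import datetime

# Illustrative request records; h_pk_reference is an assumed column name.
requests = [
    {"h_pk_reference": "IN001", "h_datelastmodified": "2019-01-03 09:00:00"},
    {"h_pk_reference": "IN002", "h_datelastmodified": "2019-01-04 11:30:00"},
    {"h_pk_reference": "IN003", "h_datelastmodified": "2019-01-04 15:45:00"},
]

last_run = datetime(2019, 1, 4)  # watermark recorded after the previous run

def modified_since(records, since):
    """Return only the records changed after the watermark."""
    fmt = "%Y-%m-%d %H:%M:%S"
    return [r for r in records
            if datetime.strptime(r["h_datelastmodified"], fmt) > since]

delta = modified_since(requests, last_run)
print([r["h_pk_reference"] for r in delta])  # ['IN002', 'IN003']
```

In practice this filter would live in the report criteria (h_datelastmodified >= last run time) rather than in local code, so only the delta ever leaves Hornbill.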

Let me know if you need any pointers.

Cheers,

Steve



This is great thanks!

Is there an option to just download the CSV into a folder? Another team that uses our Hornbill won't have access to Power BI just yet, and they have asked if reports can be scheduled and saved to a folder for the moment.

Thanks,

Samuel


Hi @samwoo

There's already a tool that runs Hornbill reports and dumps the files to a network share or folder of your choice :)

It's up on GitHub, and documented on the wiki.

Cheers,
Steve

 

