
Error when creating Input for On-Demand Information Link


Kseniia Kurianova

I have a column with string values in a table; there are 1000 rows, and the maximum string length is 30 characters. This column is used as the input parameter for on-demand data that is retrieved via an Information Link.
When I try to refresh the on-demand data, I get the following error:
 

Error message: Could not reload data.

TargetInvocationException at Spotfire.Dxp.Framework:
Exception has been thrown by the target of an invocation. (HRESULT: 80131604)

Stack Trace:
   at Spotfire.Dxp.Framework.ApplicationModel.ProgressService.ExecuteWithProgressCore(String title, String description, ProgressOperation operation)
   at Spotfire.Dxp.Forms.FormsDataCanvasWebViewImpl.HandleRefreshProducer(DataTable table, DataColumnProducer producer)


ImportException at Spotfire.Dxp.Data:
Failed to execute data query: An Internal Error has occurred. Cannot read attachment 3f002fdf-43d1-4798-a07e-2a49782d2211 (HRESULT: 80131500)

Stack Trace:
   at Spotfire.Dxp.Data.Producers.ExternalColumnProducer.<RefreshSynchronously>d__69.MoveNext()
   at Spotfire.Dxp.Web.DataCanvasWebViewImpl.<RefreshProducer>d__20.MoveNext()
   at Spotfire.Dxp.Forms.FormsDataCanvasWebViewImpl.<>c__DisplayClass4_0.<HandleRefreshProducer>b__0()
   at Spotfire.Dxp.Framework.ApplicationModel.Progress.<>c__DisplayClass22_0.<Start>b__0()
   at Spotfire.Dxp.Framework.ApplicationModel.MonitorableProgress.Start[T](Func`1 action)
   at Spotfire.Dxp.Forms.Application.FormsProgressService.ProgressThread.<DoOperationLoop>b__11_0()


ImportException at Spotfire.Dxp.Data:
Failed to execute data query: An Internal Error has occurred. Cannot read attachment 3f002fdf-43d1-4798-a07e-2a49782d2211 (HRESULT: 80131500)

Stack Trace:
   at Spotfire.Dxp.Data.DataFunctions.DataFunctionImpl.HandleExceptionDuringInvocation(DataFunctionWorkItemState workItemState, Boolean throwOnException, Boolean notifyException)
   at Spotfire.Dxp.Data.DataFunctions.DataFunctionImpl.ExecuteSynchronously(Boolean throwOnException)
   at Spotfire.Dxp.Data.Producers.ExternalColumnProducer.<>c__DisplayClass69_0.<<RefreshSynchronously>g__RefreshImpl|1>d.MoveNext()
   at Spotfire.Dxp.Data.Producers.ExternalColumnProducer.<RefreshSynchronously>d__69.MoveNext()


InformationModelException at Spotfire.Dxp.Data:
An Internal Error has occurred. Cannot read attachment 3f002fdf-43d1-4798-a07e-2a49782d2211 (HRESULT: 80131500)

Stack Trace:
   at Spotfire.Dxp.Data.InformationModel.InternalInformationModelManager.OpenJob(InformationLinkElement informationLink, IList`1 parameters, Boolean isReferenced)
   at Spotfire.Dxp.Data.InformationModel.InternalInformationModelManager.<>c__DisplayClass67_0.<ExecuteInformationLink>b__0()
   at Spotfire.Dxp.Data.InformationModel.InternalInformationModelManager.ExecuteInformationLink(InformationLinkElement informationLink, IList`1 parameters, Boolean isReferenced, Boolean allowRetry)
   at Spotfire.Dxp.Data.Import.InformationLinkDataSourceConnection.ExecuteInformationLink()


InformationModelServiceException at Spotfire.Dxp.Services:
An Internal Error has occurred. Cannot read attachment 3f002fdf-43d1-4798-a07e-2a49782d2211 (HRESULT: 80131509)

Stack Trace:
   at Spotfire.Dxp.Services.WcfSoapService`2.InvokeService[T](Func`1 serviceMethod, String customMethodNameForLogging)
   at Spotfire.Dxp.Data.InformationModel.InternalInformationModelManager.OpenJob(InformationLinkElement informationLink, IList`1 parameters, Boolean isReferenced)


However, if I split the data in the input column in two (505 and 505 rows), each half works fine as an input parameter and the data loads with no errors.
Also, if I crop the strings and keep only 22 characters (using left([Input Column],22)), it works fine as well.

What could be the issue with the full dataset, and how can I fix it?



According to your description, this looks like a problem with too many values in the IN clause of the generated SQL, which usually looks like this 👇

SELECT ...
FROM ...
WHERE 
Column in ('a','b','c',......'some value')

 

1- You can adapt your data source template to reduce "max-in-clause-size" to a lower value. As a result, Spotfire will split the list into multiple IN clauses like this 👇

SELECT ...
FROM ...
WHERE 
Column in ('a','b','c',......'some value abc')
or 
Column in ('x','y','z',......'some value xyz')

2- If your database allows creating temporary tables, you can also add the temporary-table statements to the template so that Spotfire creates a temporary table instead of using an IN clause. In this case, you will need to change "condition-list-threshold" to a lower value as well:

1-
CREATE TEMPORARY TABLE TEMP_XYX ....
INSERT INTO TEMP_XYX VALUES .....
--------

2-
SELECT ...
FROM ...
WHERE 
Column in (SELECT Column from TEMP_XYX)

3-
----
DROP TEMPORARY TABLE TEMP_XYX

 

Below is an example of a data source template containing those settings.

1-
  <max-in-clause-size>1000</max-in-clause-size>
  <condition-list-threshold>10000</condition-list-threshold>
  <expand-in-clause>false</expand-in-clause>
  <table-expression-pattern>[$$catalog$$.][$$schema$$.]$$table$$</table-expression-pattern>
  <date-format-expression>$$value$$</date-format-expression>
  <time-format-expression>convert(varchar(8), $$value$$, 108)</time-format-expression>
  <date-time-format-expression>$$value$$</date-time-format-expression>

2-
 <temp-table-name-pattern>#$$name$$</temp-table-name-pattern>
 <create-temp-table-command>CREATE TABLE $$name$$ $$column_list$$</create-temp-table-command>
 <drop-temp-table-command>DROP TABLE $$name$$</drop-temp-table-command>
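
For context - these elements sit inside the <jdbc-type-settings> element of the relevant data source template, which you edit on the Spotfire Server via the configuration tool (Data Source Templates section). Below is a minimal sketch only, with a placeholder type name and example threshold values; your actual template will keep its existing driver, connection-url-pattern and other elements:

<jdbc-type-settings>
  <type-name>my_database_type</type-name>
  <!-- driver, connection-url-pattern and the other existing elements stay as they are -->
  <max-in-clause-size>500</max-in-clause-size>
  <condition-list-threshold>500</condition-list-threshold>
  <temp-table-name-pattern>#$$name$$</temp-table-name-pattern>
  <create-temp-table-command>CREATE TABLE $$name$$ $$column_list$$</create-temp-table-command>
  <drop-temp-table-command>DROP TABLE $$name$$</drop-temp-table-command>
</jdbc-type-settings>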

 

Let me know if this helps


Hi Kseniia,

You probably see nothing since the query is failing and no output is being logged.

The initial column with 1000 rows, where does it come from? Is it from a data table that is also connected to an Information Link, or from a direct query?
And what are the data sources being used?


Could you also set your Analyst logging level to DEBUG (if not already done) and see what the Spotfire.Dxp.SupportDiagnostics.log file says about this issue when you reproduce the error? And if so, could you share that log file?


Kind regards,

David


The initial column with 1000 rows - I tried different options. Initially it also came from an Information Link, but I also tried creating the table from Excel and creating it by copying data from the clipboard; the results are the same in every case.

I have Debug and Detailed Error Messages checked, but I don't see much more information.

Spotfire.Dxp.SupportDiagnostics.log


Thanks Kseniia.

I'll check the logs and see if I can come up with a similar setup on my end (hoping to reproduce the same error).

Could you also tell us what the underlying data sources/databases are (what the Information Link should be loading the on-demand data from)?

Kind regards,

David


Hi Kseniia,

I just took a quick peek into the log file.

Two things popped up: the error message you already mentioned, as well as a (possible) authentication issue.
I looked up the error message on our support site and found some very old references.
Could you tell us which version you are on and whether all the latest hotfixes have been deployed? In those old posts, references were made to installing/deploying the latest (client) hotfix for your version of Spotfire.

Concerning the authentication issue, this is the last part of the log file (I have tried to mask as many references to your environment as possible, such as the actual server name, your ID, etc.):


2024-09-10T12:23:54,935+03:00 2024-09-10 09:23:54,935 s1102495 [Main Thread] WARN  Spotfire.Dxp.Services.Http.HttpClient [(null)] - ExecuteNonSpotfireServerRequestWithRetry(https://<spotfire server name>/spotfire/oauth2/token, RequestId , User OAuth2Identity: Auth '<id>...', Refresh '<id2>...' (False), Scope 'internal.client-login', SpotfireIdentity: Kurianova Kseniia (ext) CHST (<sID>), IsAuthenticated True, time of authentication: 9/9/2024 12:23:48 PM, , en-US, en-US, <spotfire server name>::JSESSIONID=<session id>= on <spotfire server name>/spotfire, sec=True, http=True,  expire=1/1/0001 12:00:00 AM, XSRF-TOKEN=<xsrf-token id> on <spotfire server name>/, sec=True, http=False,  expire=1/1/0001 12:00:00 AM):
 Failed to make request, giving up. Status: ProtocolError, HResult: -2146233079, Response: ---.
 SecurityProtocol Tls, Tls11, Tls12, DefaultConnectionLimit 8, MaxServicePointIdleTime 50000 ServerCertificateValidationCallback Spotfire.Dxp.Services.Http.HttpClient.RemoteCertificateValidationCallback
System.Net.WebException: The remote server returned an error: (400) Bad Request.
   at System.Net.HttpWebRequest.GetResponse()
   at Spotfire.Dxp.Services.Http.SpotfireRequest.ExecuteNonSpotfireServerRequestWithRetry(Func`1 requestCreator, RetryOptions retryOptions).

I'm not sure if this is related, but it comes up right after the error message about the failing on-demand query (multiple times, in fact).
Does the connection to VerticaDB require an authentication step that the on-demand query is missing, perhaps?

Kind regards,

David

 


We are using Spotfire 14.0.3.
No, Vertica does not require any additional authentication - the credentials are in the data source (in Spotfire Information Designer). And if, in this same report, for the same on-demand table, I change the input parameter from the problematic column to a column with half of these values, it refreshes fine in the same session. Also, if I create a calculated column with only 22 characters from the original values (I used both Left and Right), it also works fine as an input parameter and refreshes.


Hi Kseniia,

Thank you for the feedback.
So, the issue is not so much with the number of values in the 'IN' clause, but more with the content of the values.

I can confirm that more than 1000 values in the 'IN' clause will work fine in on-demand queries.

And as you mention, it seems like the content (or length) is the culprit. Are there specific characters in the column values that could cause this behavior?
Could you check what the maximum length of the values is and try putting a Left/Right statement around the column with that maximum number of characters to see if that works? I'm thinking that leading or trailing spaces may have an impact here.
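
For example (just a sketch - I'm reusing the [Input Column] name from your first post, so adjust it to your actual column), a calculated column like this would rule out leading/trailing whitespace:

Trim([Input Column])

If the full set loads with that as the input parameter, stray spaces are the likely cause; if it still fails, that points more towards the total length of the values.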

Kind regards,

David


Hi David!

We also thought that the issue might be with one of the input values containing some unusual characters, but the strange thing is that if I split the input column in half and use each half separately (505 and 505 values), each half works fine with no errors (the values are the same). I also tried the first 900 and the last 900 values; both work fine.
The values do contain characters like "\", "|", and even ";"; however, when they are part of 900 values, they all work fine.

The max length is 65.
I tried:

case
  when Len([INPUT COLUMN]) >= 50 then Left([INPUT COLUMN], 40)
  else [INPUT COLUMN]
end


That did not help.
Also, as I said before, if I trim the values and keep only 22 characters in each one, it works (characters like | \ ; still remain), but if I keep 23, it breaks.


Hi Kseniia,

Thank you for the input file. We think this issue needs deeper investigation. Therefore, we would ask you to open a support ticket via https://support.tibco.com.
Our colleagues from support can then view the monitoring logs and dive deeper into this particular issue. You may share the ticket number with us, so we can trace it as well.

To get a clear view of who is impacted: is it just you, or does this affect other users as well, given that you can't proceed with your analysis file?

Kind regards,

David

