Where I work, we have two systems that talk to SAP, one written in Delphi and another in C#. I'm implementing the C# one, and both have the same problem: when I query a large number of columns using RFC_READ_TABLE (usually 60+, depending on the table), it returns an RfcAbapException with no description and no inner exception, just a title. What is causing this exception and what can I do to prevent it?
The function module RFC_READ_TABLE has to convert the data to a generic format because "really generic" types like DATA or STANDARD TABLE are not supported for RFC communication. Because of this, the output is transmitted as a series of table lines, each a character field up to 512 characters in length.
This has several consequences:
If the total size of all fields you requested exceeds 512 characters, you will get a short dump (check with transaction ST22) and the exception you mentioned.
If you try to read fields that cannot be converted to character fields and/or do not have a fixed-length (!) character representation, bad things will happen. Most likely, RFC_READ_TABLE will either abort with a short dump or barf all over your output data.
You can bypass the first problem by slicing the table vertically and reading groups of columns sequentially; a sketch of this approach follows below. Be aware that RFC_READ_TABLE is not guaranteed to return the rows in the same order on every call, so stitching the results back together takes some care. Also be aware that you might run into violations of transaction isolation, depending on how often the data you read changes.
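A minimal sketch of that vertical-slicing approach, assuming the SAP .NET Connector (NCo 3.x), an already-configured destination, and hypothetical table and column names:

using System.Collections.Generic;
using System.Linq;
using SAP.Middleware.Connector;

// Read the requested columns in groups small enough that each result line stays under 512 characters.
static List<List<string>> ReadTableInSlices(RfcDestination dest, string table, string[] columns, int columnsPerCall)
{
    var slices = new List<List<string>>();
    for (int offset = 0; offset < columns.Length; offset += columnsPerCall)
    {
        string[] group = columns.Skip(offset).Take(columnsPerCall).ToArray();
        IRfcFunction fn = dest.Repository.CreateFunction("RFC_READ_TABLE");
        fn.SetValue("QUERY_TABLE", table);
        fn.SetValue("DELIMITER", "|");
        IRfcTable fields = fn.GetTable("FIELDS");
        foreach (string column in group)
        {
            fields.Append();                      // new row in the FIELDS table
            fields.SetValue("FIELDNAME", column);
        }
        fn.Invoke(dest);
        var lines = new List<string>();
        foreach (IRfcStructure row in fn.GetTable("DATA"))
            lines.Add(row.GetString("WA"));       // one delimited line per table row for this column group
        slices.Add(lines);
    }
    return slices;
}

The destination would come from RfcDestinationManager.GetDestination with whatever name your configuration defines. When stitching the slices back together, include a key column in every group rather than trusting positional order, for the reason mentioned above.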
I'm currently using SSIS to improve part of a project. I need to insert single documents into a MongoDB collection of type Time Series. At some point I want to output rows of data after they go through a C# transformation script. I did this:
foreach (BsonDocument bson in listBson)
{
    OutputBuffer.AddRow();
    OutputBuffer.DatalineX = (string)bson.GetValue("data");
}
But this piece of code, which works great with small files, does not work with a 6-million-line file. That is, there are no rows in the output. The subsequent tasks validate but behave as if they had received nothing as input.
Where could the problem come from?
Your OutputBuffer has DatalineX defined as a string, either DT_STR or DT_WSTR, with a specific length. When you exceed that length, things go bad. For normal string columns, the maximum lengths are 8,000 and 4,000 characters respectively.
Neither of which is useful for your use case of at least 6M characters. To handle that, you'll need to change your data type to DT_TEXT/DT_NTEXT. Those data types do not require a length because they are "max" types. There are lots of things to be aware of when using the LOB types:
Performance can suck depending on whether SSIS can keep the data in memory (good) or has to write intermediate values to disk (bad)
You can't readily manipulate them in a data flow
You'll use a different syntax in a Script Component to work with them
e.g.
// Convert the string to bytes first; for DT_NTEXT the bytes are expected to be Unicode-encoded
byte[] bytes = System.Text.Encoding.Unicode.GetBytes((string)bson.GetValue("data"));
Output0Buffer.DatalineX.AddBlobData(bytes);
There's a longer example, of questionable accuracy with regard to encoding the bytes (which you get to sort out), at https://stackoverflow.com/a/74902194/181965
I have to write a method that exports database data to a file. The software that will process this file requires a specific string length for each column - no more, no less. The data that comes in will have fewer than the required number of characters, but I have to make the length of every column match that number exactly by adding whitespace.
Does anyone have an idea how to do this in the most efficient way? The datasets won't be small, and the only thing that comes to mind is looping over every table cell, checking the length, and then padding up to the required length...
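For what it's worth, a minimal sketch of that approach using String.PadRight (the DataTable, file path, and column widths here are hypothetical):

using System;
using System.Data;
using System.IO;
using System.Text;

// Pad every cell with trailing spaces so each column has exactly the required width.
static void ExportFixedWidth(DataTable table, string path, int[] widths)
{
    using (var writer = new StreamWriter(path))
    {
        var line = new StringBuilder();
        foreach (DataRow row in table.Rows)
        {
            line.Clear();
            for (int i = 0; i < widths.Length; i++)
            {
                string value = Convert.ToString(row[i]);
                line.Append(value.PadRight(widths[i]));   // assumes the value never exceeds the width
            }
            writer.WriteLine(line.ToString());
        }
    }
}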
SCENARIO: Given 2 input strings, I need to find the minimum number of insertions, deletions and substitutions required to convert one string into the other. The strings are text from 2 files. The comparison has to be done at word level.
What I have done is implement the edit distance algorithm, which does the job well using a 2-dimensional array of size m*n, where m and n are the sizes of the input strings.
The PROBLEM I am facing is that when m and n become large, say more than 16,000, I get an OutOfMemory exception due to the size of the m*n array. I am also running into memory fragmentation and Large Object Heap issues.
QUESTION: Looking for C# code that solves the edit distance problem for 2 very large strings (each containing more than 20k words) without getting an OutOfMemory exception.
MapReduce, database, or MemoryMappedFile based solutions are not feasible. Only pure C# code will work.
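For reference, a minimal sketch of the standard space optimization: if only the distance (not the edit script) is needed, the Wagner-Fischer DP can keep just two rows instead of the full m*n matrix, bringing memory down to O(min(m, n)). The word arrays here would come from splitting the two files.

using System;

// Edit distance over word arrays using two rows of the DP matrix instead of m*n cells.
static int WordEditDistance(string[] a, string[] b)
{
    if (a.Length < b.Length) { var t = a; a = b; b = t; }   // keep the shorter sequence as b
    int[] prev = new int[b.Length + 1];
    int[] curr = new int[b.Length + 1];
    for (int j = 0; j <= b.Length; j++) prev[j] = j;

    for (int i = 1; i <= a.Length; i++)
    {
        curr[0] = i;
        for (int j = 1; j <= b.Length; j++)
        {
            int cost = a[i - 1] == b[j - 1] ? 0 : 1;
            curr[j] = Math.Min(Math.Min(curr[j - 1] + 1,    // insertion
                                        prev[j] + 1),        // deletion
                               prev[j - 1] + cost);           // substitution or match
        }
        var tmp = prev; prev = curr; curr = tmp;
    }
    return prev[b.Length];
}

Recovering the actual sequence of operations needs more than two rows (e.g. Hirschberg's divide-and-conquer variant), but the count alone fits comfortably in memory even for 20k+ words.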
Hey Everyone, I am writing some code that makes use of SQL Server CE 3.5 and I am having a very strange problem. I have a string field in one of the tables that needs to store a full file path.
In the course of trying to fix this problem I have set that field to nvarchar with a max size of 4000, but it is still cutting off longer strings that are well short of the limit.
for example:
D:\iTunes\iTunes Media\Music\Abigail Williams\In The Absence Of Light\02 Final Destiny Of The Gods.m
This is clearly smaller than 4000 characters, yet it is missing the p3 at the end of the string.
I am using a table adapter to enter the data into the database with the following query:
INSERT INTO [Track] ([Artist_ID], [Album_ID], [FilePath], [LastUpdate])
VALUES (@Art, @Al, @Fp, @LU)
I know that the strings are fully formed on insert because I am using the following code to check:
if (!temp.Filepath.EndsWith(".mp3"))
    MessageBox.Show("File Error");

this.trackTableAdapter1.InsertQuery(ArtID, AlID, temp.Filepath, File.GetLastWriteTime(temp.Filepath));
The message box does not get shown, so the string must end correctly on insert.
The query that extracts the data is:
SELECT *
FROM Track
WHERE Artist_ID = @Artist_ID AND Album_ID = @Album_ID
The involved code is:
foreach (Database.MusicDBDataSet.TrackRow TR in this.trackTableAdapter1.GetAlbumTracks(AR.Artist_ID, AlR.Album_ID).Rows)
{
    //if (!TR.FilePath.EndsWith(".mp3"))
    //    MessageBox.Show("File Path Error");
    this.ArtistList[AR.Name].AlbumList[this.ArtistList[AR.Name].AlbumList.Count - 1].TrackList.Add(new Track(TR.FilePath, AlR.Name, AR.Name));
}
Has anyone ever run into this problem before?
Check the XSD file. Specifically, check the FilePath column of your table and look for the max length.
Maybe take a look at the SQLServerCE Parameter Size limitation.
What is the specific maximum length? Is it around 100 characters? (Guessing based on the input example you provided.)
100 Unicode characters would also match D.K. Mulligan's answer. Looking at the SQL Server CE Parameter Size property:
For variable-length data types, the Size property describes the maximum amount of data to send to the server. For example, the Size property can be used to limit the amount of data sent to the server for a string value to the first 100 bytes.
For Unicode string data, the Size property refers to the number of characters. The count for strings does not include the terminating character.
Try bumping the size to see if this is the magic number that is truncating your strings.
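If the table adapter turns out to be the culprit, a minimal sketch of doing the insert with an explicit parameter size (assuming System.Data.SqlServerCe, a hypothetical open SqlCeConnection named connection, and guessed column types for the IDs) would be:

using System.Data;
using System.Data.SqlServerCe;
using System.IO;

// Give the path parameter an explicit Size instead of letting it be inferred from the first value.
using (SqlCeCommand cmd = connection.CreateCommand())
{
    cmd.CommandText = "INSERT INTO [Track] ([Artist_ID], [Album_ID], [FilePath], [LastUpdate]) " +
                      "VALUES (@Art, @Al, @Fp, @LU)";
    cmd.Parameters.Add(new SqlCeParameter("@Art", SqlDbType.Int) { Value = ArtID });
    cmd.Parameters.Add(new SqlCeParameter("@Al", SqlDbType.Int) { Value = AlID });
    cmd.Parameters.Add(new SqlCeParameter("@Fp", SqlDbType.NVarChar, 4000) { Value = temp.Filepath });
    cmd.Parameters.Add(new SqlCeParameter("@LU", SqlDbType.DateTime) { Value = File.GetLastWriteTime(temp.Filepath) });
    cmd.ExecuteNonQuery();
}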
I am trying to store large data, more than 255 characters, in a string datatype, but it truncates after 255. How can I achieve this? Basically I need to pass this data to a database.
C# strings do not have any particular character limit. However the database column you are writing to may have a limit. If you are storing large amounts of data, you should use a BLOB column instead of an ordinary varchar type.
StringBuilder class
Like they said, the string class is not limited, but you can do this for large strings; I feel it handles them better.
StringBuilder sb = new StringBuilder();
sb.Append("Some text...");
sb.Append("more text...");
sb.Append("even more text!");
string result = sb.ToString();
Okay, it sounds like you have several different technologies involved - Excel, XML, databases etc. Try to tackle just one at a time. First read the data out of Excel, and make sure you can do that without any truncation.
Write a small console app which will read the value, then write it to the console - and its length. If that works, you know the problem isn't in Excel.
Next you can write a small console app with hardcoded input data (so you don't need to keep using interop with Excel) and write the XML from that, or whatever your next stage is.
Basically, take the one big problem ("when I read data from Excel and write it to the database it truncates long values") and split it into smaller and smaller ones until you've found what's wrong.
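As an illustration of that first check, a minimal console sketch (the workbook path and cell address are hypothetical, and it assumes a reference to Microsoft.Office.Interop.Excel):

using System;
using Excel = Microsoft.Office.Interop.Excel;

// Read one cell straight from Excel and print the value plus its length.
var app = new Excel.Application();
Excel.Workbook wb = app.Workbooks.Open(@"C:\data\input.xlsx");
var sheet = (Excel.Worksheet)wb.Worksheets[1];
string value = Convert.ToString(((Excel.Range)sheet.Cells[2, 3]).Value2);
Console.WriteLine(value);
Console.WriteLine("Length: " + value.Length);
wb.Close(false);
app.Quit();

If the full length comes out here, the truncation is happening further down the chain.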
The string type does not limit strings to 255 characters. Your database column is most likely limited to 255 characters.
I know that C# strings can hold much longer data than that. If the truncation occurs on committing to the DB, check the length constraint on your DB field.
The problem lies in the Excel part; .Characters has a 255-character limitation.
To read the complete text from a shape the following VBA syntax would do:
Worksheets("YourSheet").Shapes("Shape1").OLEFormat.Object.Text