I want to read images stored in an Oracle LONG datatype. A number of images are stored in a remote Oracle database in a column of type LONG, and I just need to retrieve them and show them on my aspx page. I could retrieve the value from the database, but when I tried to cast it to a byte array it threw the error "string cannot be converted to byte[]". Does anybody have suggestions on how to retrieve these images stored in a LONG column? I have already tried various beginner techniques.
OracleDataReader TableReader = null;
string connectionString = ConfigurationManager.ConnectionStrings["System.Data.OracleClient"].ConnectionString;
using (OracleConnection connection = new OracleConnection(connectionString))
{
connection.Open();
command = new OracleCommand("Select * From IMAGE_TABLE where CUST_ID ='xxxxxxxx' fetch first 1 rows only", connection);
command.InitialLONGFetchSize = -1;
TableReader = command.ExecuteReader();
while (TableReader.Read())
{
try
{
var tt = Convert.ToInt32(TableReader[14]);
byte[] bytes = new byte[tt];
//long bytesRead = TableReader.GetBytes(13, 0, bytes, 0, tt);
//FileStream file = new FileStream("d:\\Img1.jpg", FileMode.Create, FileAccess.Write);
//file.Write(bytes, 0, (int)bytesRead);
//file.Close();
byte[] blob_ =System.Text.UTF32Encoding.ASCII.GetBytes(TableReader[13].ToString());
using (MemoryStream ms = new MemoryStream(blob_))
{
image = Image.FromStream(ms);
}
}
catch (Exception rr)
{
}
}
}
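One approach that is often suggested for this (a sketch only, assuming an ODP.NET connection and that the column is actually LONG RAW, i.e. binary; the column name IMAGE_DATA and the custId parameter are illustrative) is to read the raw bytes in chunks with GetBytes under CommandBehavior.SequentialAccess instead of going through a string conversion:

byte[] imageBytes = null;
using (var cmd = new OracleCommand(
    "SELECT IMAGE_DATA FROM IMAGE_TABLE WHERE CUST_ID = :custId", connection))
{
    // Sketch only: the column and parameter names are assumptions.
    cmd.Parameters.Add("custId", custId);
    cmd.InitialLONGFetchSize = -1; // fetch the entire LONG/LONG RAW value

    using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        if (reader.Read() && !reader.IsDBNull(0))
        {
            using (var ms = new MemoryStream())
            {
                var buffer = new byte[8192];
                long offset = 0;
                long read;
                // GetBytes copies the raw column bytes without any string conversion.
                while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
                {
                    ms.Write(buffer, 0, (int)read);
                    offset += read;
                }
                imageBytes = ms.ToArray();
            }
        }
    }
}
// imageBytes can now be written to the page response or loaded with Image.FromStream.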
private void Msg1_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
{
string SaveFileTo = "D:\\DA.jpg";
// string SaveFileTo = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
// string filename = "jmk.jpg";
// SaveFileTo = Path.Combine(SaveFileTo, filename);
CN.Open();
SqlCommand cmd = new SqlCommand("select photo from edistindata where enrollno='" + CboEnroll.Text + "' ", CN);
DR = cmd.ExecuteReader();
byte[] data = null;
while (DR.Read())
{
data = (byte[])DR["Photo"];
}
using (var fs = new FileStream(SaveFileTo, FileMode.Create, FileAccess.Write))
{
fs.Write(data,0,data.Length);
}
MessageBox.Show("Success");
}
I have run the code in the question, and even though it has some poor coding practices, it should produce the image file as long as the data in the photo column is not corrupted.
So below is your code with some enhancements that will flag corrupted data early, before it is saved.
private void Msg1_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
{
ImageCodecInfo theJpgCodecInfo = ImageCodecInfo.GetImageEncoders()
.FirstOrDefault(encoder => encoder.MimeType == "image/jpeg");
var myEncoder = Encoder.Quality;
var myEncoderPrams = new EncoderParameters(1);
myEncoderPrams.Param[0] = new EncoderParameter(myEncoder, 100L);
string SaveFileTo = "D:\\DA";
// string SaveFileTo = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
// string filename = "jmk.jpg";
// SaveFileTo = Path.Combine(SaveFileTo, filename);
CN.Open();
SqlCommand cmd = new SqlCommand("select photo from edistindata where enrollno = @Enrol;", CN);
cmd.Parameters.AddWithValue("@Enrol", CboEnroll.SelectedItem.ToString());
DR = cmd.ExecuteReader();
byte[] data = null;
int idx = 0;
while (DR.Read())
{
data = (byte[])DR["Photo"];
Image tempImage;
try
{
tempImage = ByteArrayToImage(data);
}
catch (Exception exc)
{
MessageBox.Show(exc.GetType().ToString()
+Environment.NewLine+exc.Message);
continue;
}
var tempBitmap = new Bitmap(tempImage);
var tempFname = "D:\\DA"+(idx++).ToString()+".jpg";
tempBitmap.Save(tempFname, theJpgCodecInfo, myEncoderPrams);
}
}
public static Image ByteArrayToImage(byte[] byteArrayIn)
{
using (MemoryStream ms = new MemoryStream(byteArrayIn))
{
Image returnImage = Image.FromStream(ms);
return returnImage;
}
}
The enhancements I suggest are as follows:
As @derpirscher mentioned in his comment, use parameters rather than string concatenation when building SQL commands; the easiest (if laziest) way to do that is Parameters.AddWithValue() (see also the sketch after this list).
Use CboBox.SelectedItem.ToString() in place of the .Text used in your code.
Instead of dumping the byte[] directly through FileStream.Write(), I load the byte array into an Image and then into a Bitmap.
This way, if the byte[] (and hence the data in the db column photo) is not one of the binary formats the Image class can handle, it throws an exception, which I use to show the related error.
This also avoids some issues that can arise when constructing a Bitmap directly from a byte[] that contains PNG data.
No matter what image format is stored in the photo column, it is legitimately re-encoded as JPEG and then saved.
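As a small follow-up to the first point: if the column type is known, an explicitly typed parameter avoids the type-inference quirks of AddWithValue(). A minimal sketch, assuming enrollno is a VARCHAR(20) (the SqlDbType and length are assumptions, adjust them to the real schema):

using (var cmd = new SqlCommand(
    "select photo from edistindata where enrollno = @Enrol;", CN))
{
    // Sketch only: the parameter type and length are assumed, not taken from the question.
    cmd.Parameters.Add("@Enrol", SqlDbType.VarChar, 20).Value =
        CboEnroll.SelectedItem.ToString();

    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            byte[] data = (byte[])reader["photo"];
            // validate and convert the bytes as shown above before saving
        }
    }
}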
I have a DataSet that contains an image.
I need to save this image to a file.
I tried this:
SQL = "Select ItemCode, PIC from ROW";
dsView = new DataSet();
adp = new SqlCeDataAdapter(SQL, Conn);
adp.Fill(dsView, "ROW");
adp.Dispose();
foreach (DataRow R in dsROW.Tables[0].Rows)
{
ItemCode = R["ItemCode"].ToString().Trim() ;
TEMP = R["PIC"].ToString().Trim();
Image image = R["PIC"] as Image;
if(image != null)
{
MemoryStream ms = new MemoryStream();
image.Save(ms, System.Drawing.Imaging.ImageFormat.Jpeg);
byte[] imagedata = ms.ToArray();
}
}
But image is always null,
and in TEMP I see System.Byte[].
I need some help, thanks.
Your R["PIC"] is an array of bytes.
First, you try to apply ToString() to it, and you simply get System.Byte[].
Then, you try to cast it to Image. How is Byte[] supposed to cast to Image?
You need to create an Image from your Byte array:
dsView = new DataSet();
adp = new SqlCeDataAdapter(SQL, Conn);
adp.Fill(dsView, "ROW");
adp.Dispose();
foreach (DataRow R in dsROW.Tables[0].Rows)
{
ItemCode = R["ItemCode"].ToString().Trim();
using (var ms = new MemoryStream((byte[])R["PIC"]))
{
Image image = Image.FromStream(ms);
image.Save($"C:\\Output\\YourCustomPath\\{ItemCode}.jpeg", ImageFormat.Jpeg);
}
}
You need to read the data in sequential access mode. Here is an example I use to read binary data. See the use of
SqlDataReader rdr = cmd.ExecuteReader(CommandBehavior.SequentialAccess);
snippet:
string sql = "Select file_data from utils where util_id="+uID.ToString()+";";
SqlCommand cmd = new SqlCommand(sql, database._sqlConnection);
SqlDataReader rdr = cmd.ExecuteReader(CommandBehavior.SequentialAccess); // CommandBehavior.SequentialAccess: read columns in order and every column only once!
int columnNr = 0;
byte[] filedata = null;
try
{
while (rdr.Read())
{
//load the binary data
if (rdr.IsDBNull(columnNr)) //is there any binary data? //LAST COLUMN!!!!!
filedata = null;
else
{
//read binary data
int bufferSize = 100; // Size of the BLOB buffer.
byte[] outbyte = new byte[bufferSize]; // The BLOB byte[] buffer to be filled by GetBytes.
long retval; // The bytes returned from GetBytes.
long startIndex = 0; // The starting position in the BLOB output.
MemoryStream ms = new MemoryStream();
BinaryWriter bw = new BinaryWriter(ms);
// Reset the starting byte for the new BLOB.
startIndex = 0;
// Read the bytes into outbyte[] and retain the number of bytes returned.
retval = rdr.GetBytes(columnNr, startIndex, outbyte, 0, bufferSize);
// Continue reading and writing while there are bytes beyond the size of the buffer.
while (retval >0) //== bufferSize)
{
bw.Write(outbyte);
bw.Flush();
// Reposition the start index to the end of the last buffer and fill the buffer.
startIndex += bufferSize;
retval = rdr.GetBytes(columnNr, startIndex, outbyte, 0, bufferSize);
}
bw.Close();
filedata = ms.ToArray();
ms.Close();
}
}
}
catch (SqlException ex)
{
    // handle or log the exception as appropriate for the application
}
I am sending a file to a SqlFileStream in parts, like this:
public void StreamFile(int storageId, Stream stream)
{
var select = string.Format(
#"Select TOP(1) Content.PathName(),
GET_FILESTREAM_TRANSACTION_CONTEXT() FROM FileStorage WHERE FileStorageID={0}",
storageId);
using (var conn = new SqlConnection(this.ConnectionString))
{
conn.Open();
var sqlTransaction = conn.BeginTransaction(IsolationLevel.ReadCommitted);
string serverPath;
byte[] serverTxn;
using (var cmd = new SqlCommand(select, conn))
{
cmd.Transaction = sqlTransaction;
using (SqlDataReader rdr = cmd.ExecuteReader())
{
rdr.Read();
serverPath = rdr.GetSqlString(0).Value;
serverTxn = rdr.GetSqlBinary(1).Value;
rdr.Close();
}
}
this.SaveFile(stream, serverPath, serverTxn);
sqlTransaction.Commit();
}
}
private void SaveFile(Stream clientStream, string serverPath, byte[] serverTxn)
{
const int BlockSize = 512;
using (var dest = new SqlFileStream(serverPath, serverTxn,
FileAccess.ReadWrite, FileOptions.SequentialScan, 0))
{
var buffer = new byte[BlockSize];
int bytesRead;
dest.Seek(dest.Length, SeekOrigin.Begin);
while ((bytesRead = clientStream.Read(buffer, 0, buffer.Length)) > 0)
{
dest.Write(buffer, 0, bytesRead);
}
}
clientStream.Close();
}
I tried uploading a file in 10 parts. When I looked in the folder where the data is held, I saw 10 files. For each additional part, SqlFileStream creates a new file. The last file holds all the data, but the rest of the files just waste space. Is it possible to keep all the data in one file, so that each part is simply an append operation?
SqlFileStream doesn't support "partial updates". Each time the content changes, a new file is written to disk. These extra files are de-referenced and will eventually be cleaned up, but until they are, they will affect your backups/logs/etc
If you want all the data in a single file, you will need to process the entire file in a single stream.
If you don't need to support large files ( > 2 GB) and you expect lots of small partial updates, you might be better off storing the file in a VARBINARY(MAX) column instead.
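If the parts arrive separately but the goal is a single physical file, one option in line with the answer above is to gather the parts first and open the SqlFileStream only once, so the whole content goes through a single stream. A minimal sketch, reusing the StreamFile(int, Stream) method from the question (the in-memory buffering is an assumption and only suits files that comfortably fit in memory):

// Sketch only: buffers all parts and writes them in one SqlFileStream session,
// so SQL Server materializes a single file instead of one per partial write.
public void StreamFileFromParts(int storageId, IEnumerable<Stream> parts)
{
    using (var combined = new MemoryStream())
    {
        foreach (var part in parts)
        {
            part.CopyTo(combined); // concatenate the parts in upload order
        }
        combined.Position = 0;

        this.StreamFile(storageId, combined);
    }
}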
I have stored an Excel file in my SQL Server 2008 DB (as a VARBINARY(MAX)) in the following (so far hard-coded) way:
// C# - visual studio 2008
var update = new SqlCommand("UPDATE Requests SET Attachment = @xls" +
" WHERE RequestsID = 27", conn);
update.Parameters.AddWithValue("xls", File.ReadAllBytes("C:/aFolder/hello.xlsx"));
update.ExecuteNonQuery();
It works, but I want to open it too!
How do I do that?
Note: I don't just want to read the blob data; I want to open the actual "hello.xlsx" file.
I have tried the following:
http://dotnetsoldier.blogspot.com/2007/07/how-to-retrieve-blob-object-in-winforms.html
I can see it works, as "Binary.Length" is exactly the size of my "hello.xlsx" when executing - but the file doesn't open, and that's my problem.
Please help me out!
EDIT:
HERE IS THE CODE THAT I CURRENTLY USE TO "OPEN" THE SPREADSHEET:
SqlConnection conn =
new SqlConnection
(global::MY_PROJECT.Properties.Settings.Default.DB_1ConnectionString);
conn.Open();
SqlCommand Cmd = new SqlCommand("select Attachment from Requests where RequestsID = 27", conn);
Cmd.CommandType = CommandType.Text;
SqlDataReader Reader = Cmd.ExecuteReader(CommandBehavior.CloseConnection);
//
string DocumentName = null;
FileStream FStream = null;
BinaryWriter BWriter = null;
//
//
//
byte[] Binary = null;
const int ChunkSize = 100;
int SizeToWrite = 0;
MemoryStream MStream = null;
//
while (Reader.Read())
{
DocumentName = Reader["Attachment"].ToString();
// Create a file to hold the output.
FStream = new FileStream(@"c:\" + DocumentName, FileMode.OpenOrCreate, FileAccess.Write);
BWriter = new BinaryWriter(FStream);
Binary = (Reader["Attachment"]) as byte[];
SizeToWrite = ChunkSize;
MStream = new MemoryStream(Binary);
//
for (int i = 0; i < Binary.GetUpperBound(0) - 1; i = i + ChunkSize)
{
if (i + ChunkSize >= Binary.Length) SizeToWrite = Binary.Length - i;
byte[] Chunk = new byte[SizeToWrite];
MStream.Read(Chunk, 0, SizeToWrite);
BWriter.Write(Chunk);
BWriter.Flush();
}
BWriter.Close();
FStream.Close();
}
FStream.Dispose();
conn.Close();
The code you posted looks like it probably writes the spreadsheet to disk, but I can't see any code that opens it. You would need to use something like
System.Diagnostics.Process.Start(@"c:\" + DocumentName)
I think.
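Putting the two pieces together, a minimal sketch of one way to do it, assuming the attachment has already been read into a byte[] (the helper name and file name here are illustrative):

// Sketch only: writes the retrieved bytes to a temporary file and asks the shell to
// open it with whatever application is registered for .xlsx (typically Excel).
static void OpenAttachment(byte[] attachment, string fileName)
{
    string tempPath = System.IO.Path.Combine(System.IO.Path.GetTempPath(), fileName);
    System.IO.File.WriteAllBytes(tempPath, attachment);

    // On .NET Framework this launches the associated application directly.
    System.Diagnostics.Process.Start(tempPath);
}

// Usage (illustrative): OpenAttachment(Binary, "hello.xlsx");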
I am trying to read a BLOB from an Oracle database. The function GetFileContent takes p_file_id as a parameter and returns a BLOB. The BLOB is a DOCX file that needs to be written to a folder somewhere. But I can't quite figure out how to read the BLOB. There is definitely something stored in the return_value parameter after
OracleDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess);
The value is {byte[9946]}. But I get an error when executing
long retrievedBytes = reader.GetBytes(1, startIndex, buffer, 0, ChunkSize);
It says InvalidOperationException was caught: "No data exists for the row or column."
Here is the code:
cmd = new OracleCommand("GetFileContent", oraCon);
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add("p_file_id", OracleType.Number).Direction = ParameterDirection.Input;
cmd.Parameters[0].Value = fileID;
cmd.Parameters.Add("return_value", OracleType.Blob).Direction = ParameterDirection.ReturnValue;
cmd.Connection.Open();
OracleDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess);
reader.Read();
MemoryStream memory = new MemoryStream();
long startIndex = 0;
const int ChunkSize = 256;
while (true)
{
byte[] buffer = new byte[ChunkSize];
long retrievedBytes = reader.GetBytes(1, startIndex, buffer, 0, ChunkSize); //FAILS
memory.Write(buffer, 0, (int)retrievedBytes);
startIndex += retrievedBytes;
if (retrievedBytes != ChunkSize)
break;
}
cmd.Connection.Close();
byte[] data = memory.ToArray();
memory.Dispose();
How can I read the BLOB from the function?
Looks like you are using the Microsoft Oracle Client. You probably want to use the LOB objects rather than using GetBytes(...).
I think the first link below would be the easiest for you. Here is an excerpt:
using(reader)
{
//Obtain the first row of data.
reader.Read();
//Obtain the LOBs (all 3 varieties).
OracleLob BLOB = reader.GetOracleLob(1);
...
//Example - Reading binary data (in chunks).
byte[] buffer = new byte[100];
while((actual = BLOB.Read(buffer, 0, buffer.Length)) >0)
Console.WriteLine(BLOB.LobType + ".Read(" + buffer + ", " + buffer.Length + ") => " + actual);
...
}
OracleLob::Read Method
OracleLob Class
OracleDataReader::GetOracleLob Method
On a side note, the Microsoft Oracle client is being deprecated. You may want to look into switching to Oracle's ODP.net, as that will be the only "Officially Supported" client moving forward.
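For the specific case in the question, one way to apply this with the Microsoft client is to call the function from a query so the BLOB comes back as a result-set column and can be read with GetOracleLob. A minimal sketch, assuming GetFileContent can be invoked from plain SQL and reusing oraCon and fileID from the question (the output path is illustrative):

// Sketch only: selects the function result so the BLOB is column 0 in the reader,
// then streams the OracleLob to disk in chunks. Uses System.Data.OracleClient types.
using (var cmd = new OracleCommand("SELECT GetFileContent(:p_file_id) FROM DUAL", oraCon))
{
    cmd.Parameters.Add("p_file_id", OracleType.Number).Value = fileID;
    cmd.Connection.Open();

    using (OracleDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        if (reader.Read())
        {
            OracleLob blob = reader.GetOracleLob(0); // column 0 holds the returned BLOB

            using (var output = System.IO.File.Create(@"C:\Output\file.docx")) // illustrative path
            {
                var buffer = new byte[8192];
                int read;
                while ((read = blob.Read(buffer, 0, buffer.Length)) > 0)
                {
                    output.Write(buffer, 0, read);
                }
            }
        }
    }

    cmd.Connection.Close();
}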