I have a database with EmpNo (int) and EmpImage (image) columns.
I am using an HttpHandler to display the images.
I am storing the images both in the database and in a folder.
Now I want to rename the images in the folder to match the EmpNo values, which I didn't do while uploading.
So I need to fetch the image names from the database, compare them with the image names in the folder, and rename them.
How can I fetch or extract the image names from the binary data that I get from the database using the generic handler?
I have attached the handler code for reference.
using System;
using System.Web;
using System.Data;
using System.Data.SqlClient;
public class Lab_14___ImageFetchingHandler : IHttpHandler
{
public void ProcessRequest(HttpContext context)
{
SqlConnection vConn = new SqlConnection("server=localhost; database=Asp.netDemoWebsiteDatabase; Integrated Security = SSPI;");
vConn.Open();
String vQuery = "Select EmpImage from EmpImages where EmpNo = @id";
SqlCommand vComm = new SqlCommand(vQuery, vConn);
//Receive the Id from some Form
String vId = context.Request.QueryString["id"];
vComm.Parameters.AddWithValue("@id", vId);
SqlDataReader vDr = vComm.ExecuteReader();
while (vDr.Read())
{
context.Response.ContentType = "image/jpeg";
context.Response.BinaryWrite((byte[])vDr["EmpImage"]);
// Here I need the image names, to store in a List or array. How?
}
vConn.Close();
}
public bool IsReusable
{
get
{
return false;
}
}
}
Here are different ways to inspect image metadata.
Byte[] content = (Byte[])vDr["EmpImage"];
//Option 1 - uses System.Drawing, System.IO, System.Text and System.Diagnostics
Image img = new Bitmap(new MemoryStream(content));
Encoding _Encoding = Encoding.UTF8;
var props = img.PropertyItems;
string propData;
foreach (var propertyItem in props)
{
propData = _Encoding.GetString(propertyItem.Value);
Debug.WriteLine("{0}[{1}]", propertyItem.Id, propData);
}
//Option 2 - requires references to PresentationCore and WindowsBase, plus using System.Windows.Media.Imaging
var imgFrame = BitmapFrame.Create(new MemoryStream(content));
var metadata = imgFrame.Metadata as BitmapMetadata;
//Option 3 - requires the MetadataExtractor NuGet package
var mr = ImageMetadataReader.ReadMetadata(new MemoryStream(content));
foreach (var directory in mr)
{
foreach (var tag in directory.Tags)
{
Debug.WriteLine("{0} - {1} = {2}", directory.Name, tag.Name, tag.Description);
}
}
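If the name you are after does turn out to live in one of these property items, the decoded values from option 1 above can be collected into a list for the folder comparison. This is only a sketch; which tag, if any, actually holds the original file name depends on how the images were created and uploaded:
// Reuses img and _Encoding from option 1; List<string> needs System.Collections.Generic
var names = new List<string>();
foreach (var propertyItem in img.PropertyItems)
{
    // Decode each property value and trim trailing nulls before comparing with the file names
    names.Add(_Encoding.GetString(propertyItem.Value).TrimEnd('\0'));
}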
I have a BIM model in IFC format and I want to add a new property, say cost, to every object in the model using Xbim. I am building a .NET application. The following code works well, except that the property is also added to storeys, buildings and sites - and I only want to add it to the lowest-level objects that nest no other objects.
To begin with, I have tried various methods to print the "related objects" of each object, thinking that I could filter out any objects whose related objects are non-null. This led me to look at IfcRelDefinesByType.RelatedObjects (http://docs.xbim.net/XbimDocs/html/7fb93e55-dcf7-f6da-0e08-f8b5a70accf2.htm), thinking that RelatedObjects (https://standards.buildingsmart.org/IFC/RELEASE/IFC2x3/FINAL/HTML/ifckernel/lexical/ifcreldecomposes.htm) would contain this information.
But I have not managed to implement working code from this documentation.
Here is my code:
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Windows.Forms;
using Xbim.Ifc;
using Xbim.Ifc2x3.Interfaces;
using Xbim.Ifc4.Kernel;
using Xbim.Ifc4.MeasureResource;
using Xbim.Ifc4.PropertyResource;
using Xbim.Ifc4.Interfaces;
using IIfcProject = Xbim.Ifc4.Interfaces.IIfcProject;
namespace MyPlugin0._1
{
public partial class Form1 : Form
{
public Form1()
{
InitializeComponent();
outputBox.AppendText("Plugin launched successfully");
}
private void button1_Click(object sender, EventArgs e)
{
// Setup the editor
var editor = new XbimEditorCredentials
{
ApplicationDevelopersName = "O",
ApplicationFullName = "MyPlugin",
ApplicationIdentifier = "99990100",
ApplicationVersion = "0.1",
EditorsFamilyName = "B",
EditorsGivenName = "O",
EditorsOrganisationName = "MyWorkplace"
};
// Choose an IFC file to work with
OpenFileDialog dialog = new OpenFileDialog();
dialog.ShowDialog();
string filename = dialog.FileName;
string newLine = Environment.NewLine;
// Check if the file is valid and continue
if (!filename.ToLower().EndsWith(".ifc"))
{
// Output error if the file is the wrong format
outputBox.AppendText(newLine + "Error: select an .ifc-file");
}
else
{
// Open the selected file (## Not sure what the response is to a corrupt/invalid .ifc-file)
using (var model = IfcStore.Open(filename, editor, 1.0))
{
// Output success when the file has been opened
string reversedName = Form1.ReversedString(filename);
int filenameShortLength = reversedName.IndexOf("\\");
string filenameShort = filename.Substring(filename.Length - filenameShortLength, filenameShortLength);
outputBox.AppendText(newLine + filenameShort + " opened successfully for editing");
////////////////////////////////////////////////////////////////////
// Get all the objects in the model ( ### lowest level only??? ###)
var objs = model.Instances.OfType<IfcObjectDefinition>();
////////////////////////////////////////////////////////////////////
// Create and store a new property
using (var txn = model.BeginTransaction("Store Costs"))
{
// Iterate over all the walls to initiate the Point Source property
foreach (var obj in objs)
{
// Create new property set to host properties
var pSetRel = model.Instances.New<IfcRelDefinesByProperties>(r =>
{
r.GlobalId = Guid.NewGuid();
r.RelatingPropertyDefinition = model.Instances.New<IfcPropertySet>(pSet =>
{
pSet.Name = "Economy";
pSet.HasProperties.Add(model.Instances.New<IfcPropertySingleValue>(p =>
{
p.Name = "Cost";
p.NominalValue = new IfcMonetaryMeasure(200.00); // Default Currency set on IfcProject
}));
});
});
// Add property to the object
pSetRel.RelatedObjects.Add(obj);
// Rename the object
outputBox.AppendText(newLine + "Cost property added to " + obj.Name);
obj.Name += "_withCost";
//outputBox.AppendText(newLine + obj.OwnerHistory.ToString());
}
// Commit changes to this model
txn.Commit();
};
// Save the changed model with a new name. Does not overwrite existing files but generates a unique name
string newFilename = filenameShort.Substring(0, filenameShort.Length - 4) + "_Modified.IFC";
int i = 1;
while (File.Exists(newFilename))
{
newFilename = filenameShort.Substring(0, filenameShort.Length - 4) + "_Modified(" + i.ToString() + ").IFC";
i += 1;
}
model.SaveAs(newFilename); // (!) Gets stored in the project folder > bin > Debug
outputBox.AppendText(newLine + newFilename + " has been saved");
};
}
}
// Reverse string-function
static string ReversedString(string text)
{
if (text == null) return null;
char[] array = text.ToCharArray();
Array.Reverse(array);
return new String(array);
}
private void Form1_Load(object sender, EventArgs e)
{
}
}
}
You're starting out by getting too broad a set of elements in the model. Pretty much everything in an IFC model will be classed as (or 'derived from') an instance of IfcObjectDefinition - including Spatial concepts (spaces, levels, zones etc) as well as more abstract concepts of Actors (people), Resources and the like.
You'd be better off filtering objs down to more specific types such as IfcElement or IfcBuildingElement - or even the more real-world elements below them (IfcWindow, IfcDoor, etc.).
// Get all the building elements in the model
var objs = model.Instances.OfType<IfcBuildingElement>();
You could also filter on more than just the element type by using the other IFC relationships, as in the sketch below.
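For example, here is a minimal sketch of that idea, assuming the same concrete Xbim.Ifc4 classes used above (for an IFC2x3 file you would query the IIfc* interfaces from Xbim.Ifc4.Interfaces instead). It keeps only the building elements that do not themselves aggregate other objects:
// Every object that acts as the parent in a decomposition relationship
var parents = new HashSet<IfcObjectDefinition>(
    model.Instances.OfType<IfcRelAggregates>()
         .Select(rel => rel.RelatingObject));
// Keep only the building elements that are not parents of anything
var objs = model.Instances.OfType<IfcBuildingElement>()
                .Where(e => !parents.Contains(e))
                .ToList();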
As a tutorial project I had to create a Coffee Machine Simulator using C#. I have completed the project successfully, but I would like the contents of the variables to be written to a file so that the user does not need to set everything up again. I have tried this by working through the demo project from Microsoft:
using System;
using System.Linq;
using System.IO;
using System.Reflection.Metadata;
using System.Text;
namespace TestingCode
{
class Program
{
public static void Main()
{
string path = "Test.txt";
try
{
// Create the file, or overwrite if the file exists.
using (FileStream fs = File.Create(path))
{
Console.WriteLine("Enter a string:");
string input = Console.ReadLine();
byte[] info = new UTF8Encoding(true).GetBytes(input);
// Add some information to the file.
fs.Write(info, 0, info.Length);
}
// Open the stream and read it back.
using (StreamReader sr = File.OpenText(path))
{
string s = "";
while ((s = sr.ReadLine()) != null)
{
Console.WriteLine(s);
}
}
}
catch (Exception ex)
{
Console.WriteLine(ex.ToString());
}
Console.ReadLine();
}
}
}
This code runs successfully and writes the user's input to a text file. Can anybody please help me write the variables to a text file and read them back as well?
Thanks,
KINGAWESOME266
Edit: you can install Newtonsoft.Json using the NuGet package manager.
The fastest and easiest way I can think of is using Newtonsoft.Json to serialize the data.
Create a class to store your variables. Let's say we have this Model class:
public class Model
{
public int Variable1;
public string Variable2;
public List<string> Variable3;
}
Here is our object that we want to serialize:
Model m = new Model()
{
Variable1 = 1,
Variable2 = "test test",
Variable3 = new List<string>() { "list element 1 ", "list element 2", "list element 3"}
};
To serialize this object, call JsonConvert.SerializeObject with your object as the parameter:
var serializedData = JsonConvert.SerializeObject(m);
The output is a JSON string, in our case:
{"Variable1":1,"Variable2":"test test","Variable3":["list element 1 ","list element 2","list element 3"]}
Save the string to a file and read it back again
File.WriteAllText("serialized.txt", serializedData);
var loadedData = File.ReadAllText("serialized.txt");
Convert the string you read back into an object by calling JsonConvert.DeserializeObject<Model>(loadedData):
var loadedObject = JsonConvert.DeserializeObject<Model>(loadedData);
Complete example (it needs using directives for System.Collections.Generic, System.IO and Newtonsoft.Json):
static void Main(string[] args)
{
Model m = new Model()
{
Variable1 = 1,
Variable2 = "test test",
Variable3 = new List<string>() { "list element 1 ", "list element 2", "list element 3" }
};
var serializedData = JsonConvert.SerializeObject(m);
File.WriteAllText("serialized.txt", serializedData);
var loadedData = File.ReadAllText("serialized.txt");
var loadedObject = JsonConvert.DeserializeObject<Model>(loadedData);
}
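Applied to the coffee machine, the same pattern works with whatever settings class you already have; the CoffeeMachineSettings name and its fields below are hypothetical placeholders for your own variables (same using directives as the complete example):
public class CoffeeMachineSettings
{
    public int WaterLevelMl;         // hypothetical setting
    public int BeansGrams;           // hypothetical setting
    public string LastSelectedDrink; // hypothetical setting
}
// Load on startup, falling back to defaults if the file does not exist yet
var settings = File.Exists("settings.json")
    ? JsonConvert.DeserializeObject<CoffeeMachineSettings>(File.ReadAllText("settings.json"))
    : new CoffeeMachineSettings();
// ... the user changes settings while the program runs ...
// Save on exit
File.WriteAllText("settings.json", JsonConvert.SerializeObject(settings));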
I have a Windows Application where in MainWindow.xaml.cs I have a button click option to import, which goes off to run Main.Run, based on a spreadsheet that was dragged into a text box:
private void Btn_Import_Click(object sender, RoutedEventArgs e)
{
Main.Run(Global.bindingObjects.spreadsheetTxtBxFile);
}
The above code takes us to a Task, which should then go off and run DefunctRun in a different project, but it never gets there when I use a breakpoint and F10 in debug:
internal static void Run(string spreadsheetpath)
{
Task task = new Task(
() =>
{
try
{
PICSObjects.DefunctFields.DefunctRun(spreadsheetpath);
}
finally
{
}
}
);
task.Start();
}
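For reference, a common variant of this pattern (only a sketch, not the fix the poster eventually applied) uses Task.Run and returns the task, so the caller can await it and any exception thrown inside DefunctRun surfaces to the caller instead of going unnoticed on the background task:
// Requires using System.Threading.Tasks;
internal static Task Run(string spreadsheetpath)
{
    // Task.Run queues the work on the thread pool and returns the Task,
    // so the caller can await it or inspect its Exception property afterwards.
    return Task.Run(() => PICSObjects.DefunctFields.DefunctRun(spreadsheetpath));
}
The button handler can then be declared async void and await Main.Run(...), which also makes failures show up in the debugger on the UI side.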
Here is the code it should go off and perform; it needs to turn the spreadsheet path into a dataset, which this class should do if it can be reached:
using InfExcelExtension;
using System.Data;
using System.IO;
using System.Diagnostics;
namespace PICSObjects
{
public static partial class DefunctFields
{
public static void DefunctRun(string spreadsheetpath)
{
//Sets up string values to be used later//
DataSet origdata = new DataSet();
DataSet newdata = new DataSet();
string filespath = @"";
string output = @".xlsx";
string searchtext = "";
string currentscript = "";
string readscript = "";
//Converts the above Path string (which should be a spreadsheet) into a dataset with datatables//
origdata = ExcelToDataset.ToDataSet(spreadsheetpath);
//Sets up two new tables in new dataset//
newdata.Tables.Add("Scripts");
newdata.Tables.Add("Tables");
//Add columns to the new tables//
newdata.Tables["Scripts"].Columns.Add("ToRemove");
newdata.Tables["Scripts"].Columns.Add("ScriptName");
newdata.Tables["Scripts"].Columns.Add("RelatedTable");
newdata.Tables["Tables"].Columns.Add("TableName");
newdata.Tables["Tables"].Columns.Add("ScriptName");
//Sets the directory to browse in from the filespath specified at the top//
DirectoryInfo d = new DirectoryInfo(filespath.ToString());
//Goes through each file in specified directory that has .sql as the extension//
foreach (var file in d.GetFiles("*.sql"))
{
currentscript = file.Name.ToString();
readscript = File.ReadAllText(file.FullName).ToLower();
//Goes through each "Field" value from the column and sets to Lower Case//
foreach (DataRow dr in origdata.Tables["Fields"].Rows)
{
searchtext = dr["ToRemove"].ToString().ToLower();
//If the Field value appears in the file it's currently looking at, it'll put it into our new dataset's new datatable//
if (readscript.Contains(searchtext))
{
DataRow row = newdata.Tables["Scripts"].NewRow();
row["ToRemove"] = searchtext;
row["ScriptName"] = currentscript;
row["RelatedTable"] = dr["Table"];
newdata.Tables["Scripts"].Rows.Add(row);
}
}
//Whilst going through the files in the specified folder, we also look at what tables from origdata that are mentioned in the files as these are the defunct tables and need flagging//
foreach (DataRow dr in origdata.Tables["Tables"].Rows)
{
searchtext = dr["Tables"].ToString();
if (readscript.Contains(searchtext))
{
DataRow row = newdata.Tables["Tables"].NewRow();
row["TableName"] = searchtext;
row["ScriptName"] = currentscript;
newdata.Tables["Tables"].Rows.Add(row);
}
}
}
newdata.ToWorkBook(output);
Process.Start(output);
}
}
}
I've sorted this one now. It just needed a fair bit of tweaking, based on everyone's comments.
Is there a way to use the SQL Server 2012 Microsoft.SqlServer.Dac namespace to determine whether a database's schema is identical to that described by a DacPackage object? I've looked at the API docs for DacPackage as well as DacServices, but I'm not having any luck; am I missing something?
Yes, there is. I have been using the following technique since 2012 without issue.
Calculate a fingerprint of the dacpac.
Store that fingerprint in the target database.
The .dacpac is just a zip file containing goodies like metadata and model information.
Inside the .dacpac you will find files such as model.xml and Origin.xml.
The file model.xml has XML structured like the following:
<DataSchemaModel>
<Header>
... developer specific stuff is in here
</Header>
<Model>
.. database model definition is in here
</Model>
</DataSchemaModel>
What we need to do is extract the contents of <Model>...</Model> and treat that as the fingerprint of the schema.
"But wait!" you say. "Origin.xml has the following nodes:"
<Checksums>
<Checksum Uri="/model.xml">EB1B87793DB57B3BB5D4D9826D5566B42FA956EDF711BB96F713D06BA3D309DE</Checksum>
</Checksums>
In my experience, this <Checksum> node changes even when there is no schema change in the model.
So let's get to it.
Calculate the fingerprint of the dacpac.
using System;
using System.IO;
using System.IO.Packaging;
using System.Security.Cryptography;
using System.Text;
using System.Xml;
static string DacPacFingerprint(byte[] dacPacBytes)
{
using (var ms = new MemoryStream(dacPacBytes))
using (var package = ZipPackage.Open(ms))
{
var modelFile = package.GetPart(new Uri("/model.xml", UriKind.Relative));
using (var streamReader = new System.IO.StreamReader(modelFile.GetStream()))
{
var xmlDoc = new XmlDocument() { InnerXml = streamReader.ReadToEnd() };
foreach (XmlNode childNode in xmlDoc.DocumentElement.ChildNodes)
{
if (childNode.Name == "Header")
{
// skip the Header node as described
xmlDoc.DocumentElement.RemoveChild(childNode);
break;
}
}
using (var crypto = new SHA512CryptoServiceProvider())
{
byte[] retVal = crypto.ComputeHash(Encoding.UTF8.GetBytes(xmlDoc.InnerXml));
return BitConverter.ToString(retVal).Replace("-", "");// hex string
}
}
}
}
With this fingerprint now available, the pseudo-code for applying a dacpac can be:
void main()
{
var dacpacBytes = File.ReadAllBytes("<path-to-dacpac>");
var dacpacFingerPrint = DacPacFingerprint(dacpacBytes);// see above
var databaseFingerPrint = Database.GetFingerprint();//however you choose to do this
if(databaseFingerPrint != dacpacFingerPrint)
{
DeployDacpac(...);//however you choose to do this
Database.SetFingerprint(dacpacFingerPrint);//however you choose to do this
}
}
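For completeness, one possible way to implement the Database.GetFingerprint / Database.SetFingerprint placeholders (purely an assumption; the approach above deliberately leaves this choice to you) is a single-row table in the target database:
// Assumes a table created as:
//   CREATE TABLE dbo.SchemaFingerprint (Fingerprint NVARCHAR(256) NOT NULL)
using System.Data.SqlClient;
static string GetFingerprint(string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("SELECT TOP 1 Fingerprint FROM dbo.SchemaFingerprint", conn))
    {
        conn.Open();
        return cmd.ExecuteScalar() as string; // null if the dacpac has never been deployed
    }
}
static void SetFingerprint(string connectionString, string fingerprint)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "DELETE FROM dbo.SchemaFingerprint; INSERT INTO dbo.SchemaFingerprint (Fingerprint) VALUES (@fp);", conn))
    {
        cmd.Parameters.AddWithValue("@fp", fingerprint);
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}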
Here's what I've come up with, but I'm not really crazy about it. If anyone can point out any bugs, edge cases, or better approaches, I'd be much obliged.
...
DacServices dacSvc = new DacServices(connectionString);
string deployScript = dacSvc.GenerateDeployScript(myDacpac, @"aDb", deployOptions);
if (DatabaseEqualsDacPackage(deployScript))
{
Console.WriteLine("The database and the DacPackage are equal");
}
...
bool DatabaseEqualsDacPackage(string deployScript)
{
string equalStr = string.Format("GO{0}USE [$(DatabaseName)];{0}{0}{0}GO{0}PRINT N'Update complete.'{0}GO", Environment.NewLine);
return deployScript.Contains(equalStr);
}
...
What I really don't like about this approach is that it's entirely dependent upon the format of the generated deployment script, and therefore extremely brittle. Questions, comments and suggestions very welcome.
@Aaron Hudon's answer does not account for post-deployment script changes. Sometimes you just add a new entry to a type table without changing the model. In our case we want this to count as a new dacpac. Here is my modification of his code to account for that:
using System;
using System.IO;
using System.IO.Packaging;
using System.Linq;
using System.Security.Cryptography;
using System.Text;
using System.Xml;

private static string DacPacFingerprint(string path)
{
using (var stream = File.OpenRead(path))
using (var package = Package.Open(stream))
{
var extractors = new IDacPacDataExtractor [] {new ModelExtractor(), new PostScriptExtractor()};
string content = string.Join("_", extractors.Select(e =>
{
var modelFile = package.GetPart(new Uri($"/{e.Filename}", UriKind.Relative));
using (var streamReader = new StreamReader(modelFile.GetStream()))
{
return e.ExtractData(streamReader);
}
}));
using (var crypto = new MD5CryptoServiceProvider())
{
byte[] retVal = crypto.ComputeHash(Encoding.UTF8.GetBytes(content));
return BitConverter.ToString(retVal).Replace("-", "");// hex string
}
}
}
private class ModelExtractor : IDacPacDataExtractor
{
public string Filename { get; } = "model.xml";
public string ExtractData(StreamReader streamReader)
{
var xmlDoc = new XmlDocument() { InnerXml = streamReader.ReadToEnd() };
foreach (XmlNode childNode in xmlDoc.DocumentElement.ChildNodes)
{
if (childNode.Name == "Header")
{
// skip the Header node as described
xmlDoc.DocumentElement.RemoveChild(childNode);
break;
}
}
return xmlDoc.InnerXml;
}
}
private class PostScriptExtractor : IDacPacDataExtractor
{
public string Filename { get; } = "postdeploy.sql";
public string ExtractData(StreamReader stream)
{
return stream.ReadToEnd();
}
}
private interface IDacPacDataExtractor
{
string Filename { get; }
string ExtractData(StreamReader stream);
}
I want to send a file attachment from a FileUpload control into Smartsheet. I am using the SDK, and I found some sample code for attachments.
This is my code for the attachment:
if (fileUpload.HasFile)
{
string fileName = fileUpload.PostedFile.FileName;
string sourceFile = Server.MapPath("~/")+fileName;
fileUpload.PostedFile.SaveAs(sourceFile);
string type = fileUpload.PostedFile.ContentType;
smartsheet.Sheets().Attachments().AttachFile(sheetId, sourceFile, type);
}
I read about the AttachFile() method, which must use an ObjectId, but I do not understand how to get the ObjectId, so I used the sheetId.
Edit:
That code was correct: when I run it, I find the file attachment in the attachments tab of my sheet, but I want to attach the file to the new row I added.
Can you help me solve this?
I am still new to Smartsheet and still learning how to use the Smartsheet API C# SDK, but I have not found many examples or sample code for the Smartsheet API.
Here is my full code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using Smartsheet.Api;
using Smartsheet.Api.Models;
using Smartsheet.Api.OAuth;
namespace smartsheet
{
public partial class TestInput : System.Web.UI.Page
{
protected void Page_Load(object sender, EventArgs e)
{
}
protected void btnSubmit_Click(object sender, EventArgs e)
{
try
{
string strToken = "649qjpt7pmq8i0xze5550hr12x";
long sheetId = 4204618734430084;
Token token = new Token();
token.AccessToken = strToken;
SmartsheetClient smartsheet = new SmartsheetBuilder().SetAccessToken(token.AccessToken).Build();
Smartsheet.Api.Models.Home home = smartsheet.Home().GetHome(new ObjectInclusion[] { ObjectInclusion.TEMPLATES });
List<Column> cols = new List<Column>(smartsheet.Sheets().Columns().ListColumns(sheetId));
Cell cell1 = new Cell();
cell1.ColumnId = cols[0].ID;
cell1.Value = txFirstname.Text;
Cell cell2 = new Cell();
cell2.ColumnId = cols[1].ID;
cell2.Value = txLastname.Text;
List<Cell> cells = new List<Cell>();
cells.Add(cell1);
cells.Add(cell2);
Row row = new Row();
row.Cells=cells;
List<Row> rows = new List<Row>();
rows.Add(row);
RowWrapper rowWrapper = new RowWrapper.InsertRowsBuilder().SetRows(rows).SetToBottom(true).Build();
smartsheet.Sheets().Rows().InsertRows(sheetId, rowWrapper);
if (fileUpload.HasFile)
{
string fileName = fileUpload.PostedFile.FileName;
string sourceFile = Server.MapPath("~/")+fileName;
fileUpload.PostedFile.SaveAs(sourceFile);
string type = fileUpload.PostedFile.ContentType;
smartsheet.Sheets().Attachments().AttachFile(sheetId, sourceFile, type);
}
}
catch (Exception ex)
{
LableMsg.Text = ex.Message.ToString();
}
}
}
}
I'm not a C# developer, but I am very familiar with Smartsheet's Java SDK, which the C# SDK was modeled after, so there may be some slight syntax problems with the answer below, but it should give you the gist of things.
You are almost there - to attach a file to a new row, you'll just need to get the id of the new row you just created and then attach to it instead of the sheet. Change your line:
smartsheet.Sheets().Rows().InsertRows(sheetId, rowWrapper);
to
IList<Row> insertedRows = smartsheet.Sheets().Rows().InsertRows(sheetId, rowWrapper);
long newRowId = (long)insertedRows[0].ID;
Now you can attach directly to that row:
smartsheet.Rows().Attachments().AttachFile(newRowId, sourceFile, type);
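Putting the pieces together with the same SDK calls used in the question (a sketch; exact method names can differ slightly between SDK versions):
// Insert the row first and capture the new row's id
IList<Row> insertedRows = smartsheet.Sheets().Rows().InsertRows(sheetId, rowWrapper);
long newRowId = (long)insertedRows[0].ID;
if (fileUpload.HasFile)
{
    string fileName = fileUpload.PostedFile.FileName;
    string sourceFile = Server.MapPath("~/") + fileName;
    fileUpload.PostedFile.SaveAs(sourceFile);
    string type = fileUpload.PostedFile.ContentType;
    // Attach to the new row instead of the sheet
    smartsheet.Rows().Attachments().AttachFile(newRowId, sourceFile, type);
}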