How can I use the COPY command for bulk insertion of data into a PostgreSQL table from a CSV file on a remote machine, using C#?
I have a front end in C# with which I need to load data into a PostgreSQL table from a CSV file. The database is on a remote server and the CSV file is on my machine.
Assuming the CSV file is on the machine running the C# program, you need to use COPY ... FROM STDIN. This is difficult to use directly from client drivers, but most of them support their own interfaces on top of it.
I'm guessing you are using Npgsql, since you didn't mention which client you're using. If so, you can use NpgsqlCopyIn.
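For illustration, here is a rough sketch using the newer Npgsql BeginTextImport API (newer Npgsql versions replaced NpgsqlCopyIn with this); the connection string, table name and CSV path are placeholders:
using System.IO;
using Npgsql;

using (var conn = new NpgsqlConnection("Host=remotehost;Username=me;Password=pw;Database=mydb"))
{
    conn.Open();

    // COPY ... FROM STDIN streams the local CSV through the client connection,
    // so the server never needs direct access to the file.
    using (var writer = conn.BeginTextImport(
        "COPY mytable FROM STDIN (FORMAT csv, HEADER true)"))
    using (var reader = new StreamReader(@"C:\data\input.csv"))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
            writer.WriteLine(line);
    }
}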
Not possible at all with a plain COPY ... FROM 'file', unless you mount the remote machine on the Postgres server: to COPY from a file, Postgres needs to be able to access the file locally.
You could try to scp the file from A to B, or parse the file yourself and do bulk inserts into Postgres (a C# sketch follows the SQL outline below), e.g.:
create table file (structure of the file);
Read 100 lines, then:
insert into file
values (line 1),
       (line 2),
       ...
       (line 100);
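For what it's worth, a rough C# sketch of that batched approach might look like the following; the table name ("file"), its two columns, the connection string and the naive comma split are all placeholder assumptions:
using System.Collections.Generic;
using System.IO;
using Npgsql;

using (var conn = new NpgsqlConnection("Host=remotehost;Username=me;Password=pw;Database=mydb"))
{
    conn.Open();
    var batch = new List<string[]>();

    foreach (var line in File.ReadLines(@"C:\data\input.csv"))
    {
        batch.Add(line.Split(','));          // naive split; real CSV needs proper parsing
        if (batch.Count < 100) continue;     // flush every 100 rows

        using (var cmd = new NpgsqlCommand { Connection = conn })
        {
            var valueClauses = new List<string>();
            for (int i = 0; i < batch.Count; i++)
            {
                valueClauses.Add(string.Format("(@a{0}, @b{0})", i));
                cmd.Parameters.AddWithValue("@a" + i, batch[i][0]);
                cmd.Parameters.AddWithValue("@b" + i, batch[i][1]);
            }
            cmd.CommandText = "INSERT INTO file (col1, col2) VALUES " +
                              string.Join(", ", valueClauses);
            cmd.ExecuteNonQuery();
        }
        batch.Clear();
    }
    // Any rows left over (fewer than 100) still need one final INSERT here.
}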
Related
I am using Microsoft Azure for my web application (C#), which uses an SQL Database there. I have a very important SELECT statement that I run to return some data for me. I need to save that data as a .txt file, Text (tab delimited).
I ran the query on the database using the Query Editor inside Azure. The issue I am facing is that the Query Editor seems to only allow exporting it as .json, .csv, or .xml.
See the screenshot below; the dropdown does not contain a .txt format.
How can I export that data as a .txt from this tool within Azure?
I can't copy and paste the results into a Notepad file straight from the Query Editor because it does not come out formatted correctly; it needs to be Text (tab delimited).
Currently the only workaround I have found is to use my local machine with SSMS (SQL Server Management Studio) to connect to the Azure database in the cloud, and THEN I can run the query and Save As a .txt (all from my local computer, which has SSMS installed).
I can't do this from a machine that does not have the database connection, and I can't do this when I am on the road without my desktop with SSMS installed... so it is important that I can log into the Azure portal and save the data in the file format I need straight from Azure itself.
Any suggestions on how to save as a .txt, Text (tab delimited), straight from an Azure SQL database?
I created an Azure SQL database, created a table, and inserted values into it. When I tried to save the data, only the .csv, .json, and .xml formats were available in the Azure portal.
As far as I know, we cannot export the result to text format in the Azure portal itself. As per this, Azure Data Studio also does not support this operation. So I connected to the database with SSMS.
I executed the query whose result I want to convert into a txt file,
right-clicked on the result, and from the list below selected the Save Results As option:
I got the file path prompt to save the txt file, named it result.txt, and saved it.
result.txt file:
Using SSMS I get a perfect result for the query, as mentioned above. If you are unable to do this with SSMS, try the BCP command in a command prompt:
bcp "SELECT * FROM dbo.product" queryout <filepath>test.txt -S <servername>.database.windows.net -d <databasename> -U <username> -P <password>
You will get a result like the one below:
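Since the application itself is in C#, another option is a small sketch like the one below that runs the query and writes tab-delimited text directly; the connection string, query, and output path are placeholders taken from the BCP example above:
using System.Data.SqlClient;
using System.IO;

using (var conn = new SqlConnection(
    "Server=tcp:<servername>.database.windows.net;Database=<databasename>;User ID=<username>;Password=<password>;"))
using (var cmd = new SqlCommand("SELECT * FROM dbo.product", conn))
using (var writer = new StreamWriter(@"C:\temp\result.txt"))
{
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        // header row
        var headers = new string[reader.FieldCount];
        for (int i = 0; i < reader.FieldCount; i++)
            headers[i] = reader.GetName(i);
        writer.WriteLine(string.Join("\t", headers));

        // data rows, tab separated
        while (reader.Read())
        {
            var values = new object[reader.FieldCount];
            reader.GetValues(values);
            writer.WriteLine(string.Join("\t", values));
        }
    }
}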
I need to import an XML file into a Microsoft SQL Server database, and I have an XML schema file (.xsd) that contains the mapping info for the XML file to be imported. Each valid data line in the XML file needs to be imported into a database table row with several columns.
Originally I was using SQLXMLBulkLoad (non-transaction mode, specifying a database connection string) to do the job, and it works just fine. However, there is a new requirement that I execute a stored procedure on the same database connection that SQLXMLBulkLoad uses. The purpose of this stored procedure is to bind the connection to a specific ID for the current user, because there might be multiple users importing XML files at the same time. So I have to use transaction mode instead (specifying a database connection rather than a connection string for SQLXMLBulkLoad), which introduces the well-known sharing issue in a remote environment: if the XML file is local and the DB server is remote, I need to create a shared folder for SQLXMLBulkLoad to work. This is bad, but it can be done.
The problem is that the shared-folder approach is not acceptable to our clients, so I need a better solution that satisfies the new requirement without the sharing issue.
I've Googled a while and found one solution that might work, SqlBulkCopy, but I have trouble parsing the schema file (.xsd) because it contains a lot of XML-to-database table mapping information.
Any idea would be appreciated.
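One possible direction, sketched below under some assumptions: let a DataSet parse the XML using the .xsd, then push the resulting table with SqlBulkCopy on the same connection the stored procedure runs on. The stored procedure name, its parameter, the target table, and the file paths are all placeholders, and this assumes the .xsd maps the XML to a reasonably flat table layout:
using System.Data;
using System.Data.SqlClient;

string connectionString = "Data Source=remoteServer;Initial Catalog=myDb;Integrated Security=True"; // placeholder
int userId = 42;   // hypothetical ID for the current user

var ds = new DataSet();
ds.ReadXmlSchema(@"C:\data\mapping.xsd");   // builds DataTables from the schema
ds.ReadXml(@"C:\data\data.xml");            // loads the XML into those tables

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // The binding stored procedure runs first, on the very same connection
    // that the bulk copy will use.
    using (var cmd = new SqlCommand("dbo.BindUserSession", conn))   // hypothetical procedure
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@UserId", userId);
        cmd.ExecuteNonQuery();
    }

    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.TargetTable";   // placeholder
        bulk.WriteToServer(ds.Tables[0]);   // data streams from the client, no shared folder needed
    }
}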
I need to check the status (does it exist? what is its last modified date?) of a file on multiple remote Windows servers (on the LAN). The remote servers need a user name and password to access.
I was trying to do it using T-SQL (SQL Server 2005), but I'm thinking it may be best done using a CLR procedure/function. The reason for using a stored procedure is that this will be used by an SSRS report to show, for each server, whether it has the file (and its last modified date) or not. The parameter for the procedure/function should be the UNC path of the file on the server.
I know about CLR, but need the C# code to do this. Thanks.
If the files in question are locatable via UNC path names, then this is quite straightforward using C#:
var fInfo = new FileInfo(@"\\server\share\somefile.txt");   // the UNC path goes here
if (fInfo.Exists && fInfo.LastWriteTime > DateTime.Now.AddHours(-1))
{
    // file exists and was modified within the last hour
}
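Wrapped up as a SQL CLR function, it might look roughly like the sketch below. The class and function names are placeholders; note that the assembly would need the EXTERNAL_ACCESS permission set, and the SQL Server service account (or impersonated credentials) must be able to reach the share:
using System.Data.SqlTypes;
using System.IO;
using Microsoft.SqlServer.Server;

public class FileStatus
{
    // Scalar CLR function: returns the last write time of the file at a UNC path,
    // or NULL if the file does not exist or cannot be reached.
    [SqlFunction]
    public static SqlDateTime GetLastModified(SqlString uncPath)
    {
        if (uncPath.IsNull)
            return SqlDateTime.Null;

        var fInfo = new FileInfo(uncPath.Value);
        return fInfo.Exists ? new SqlDateTime(fInfo.LastWriteTime) : SqlDateTime.Null;
    }
}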
I am using VS2010, and I built a .mdf file using SQL Server 2008.
I want to use this database file in my WPF application so that I can add rows to it and delete rows from it.
The problem is, I can't access this file, and all the inserting and deleting is actually happening on the DataContext I created.
I used myDataContext.SubmitChanges(), but it didn't work either.
I tried to add a connection string, when defining the DataContext, that holds the path of my .mdf file, and it gave me a runtime error when trying to access this file. The error message says:
An attempt to attach an auto-named database for file Trial.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.
Please help me; I have searched a lot but couldn't find any help.
If the application is not going to be installed in a way where many clients access the same server, you may want to consider using SQL Server Compact Edition.
Are you sure the connection string in your app.config refers to the local .mdf file? Perhaps it refers to the server instance?
What technology do you use: LINQ to SQL or Entity Framework (I think you have to call SaveChanges, not AcceptChanges)?
If you do intend to access the server instance, then the problem seems to be a security restriction.
Please add more details on statement no. 1, and I'll write further info.
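In the meantime, for reference, an attach-style connection string for a local .mdf usually looks something like the sketch below (SQL Server Express; the instance name, file name and context class are assumptions based on your error message):
// |DataDirectory| resolves to the application's output folder at runtime.
string connStr =
    @"Data Source=.\SQLEXPRESS;" +
    @"AttachDbFilename=|DataDirectory|\Trial.mdf;" +
    "Integrated Security=True;User Instance=True";

using (var db = new MyDataContext(connStr))   // MyDataContext: your LINQ to SQL context
{
    // ... add/remove rows via the table properties on the context ...
    db.SubmitChanges();   // persists the pending changes to the attached .mdf
}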
I am developing a desktop application in C# and SQL Server 2005. In the candidate data entry form I want to provide an option to attach required documents (in PDF format) with the data. Kindly let me know the best method. Thank you in advance.
Simply create a table that will contain the filename and server path for the file to be attached, then create a method to copy the attached file to the server location and store the relevant information (name and path) in the table. Use other methods to retrieve the file from the server location when requested. Simple.
I personally prefer to store the documents as BLOBs, since server file structures and paths can change over time.
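A minimal sketch of the BLOB route, assuming a varbinary(max) column; the table name, column names, connection string and file path are all placeholders:
using System.Data.SqlClient;
using System.IO;

string connectionString = "Data Source=myServer;Initial Catalog=myDb;Integrated Security=True"; // placeholder
byte[] pdfBytes = File.ReadAllBytes(@"C:\docs\resume.pdf");   // the attached PDF

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "INSERT INTO CandidateDocuments (FileName, Content) VALUES (@name, @content)", conn))
{
    cmd.Parameters.AddWithValue("@name", "resume.pdf");
    cmd.Parameters.AddWithValue("@content", pdfBytes);   // stored in a varbinary(max) column
    conn.Open();
    cmd.ExecuteNonQuery();
}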
Well, then unfortunately you either have to manage the file storage yourself using the server's file system, or you could store it in the DB itself (IT WILL GET BLOATED!!!).
See sql server 2005 file storage
How To: Encrypt and Manage Documents with SQL Server 2005
OK, then for file management
see this example
File Manager Component