I'm developing an application using C# 4.0 and SQL Server 2008 R2 Express. My application needs to store and retrieve files (docx, pdf, png) both locally and remotely. Which approach would be best?
Store the files in a separate database (problem: restricted to 10 GB)
Use a Windows shared folder (how would I do this?)
Use an FTP server (which server and library, and how would I do this?)
SQL Server supports FILESTREAM, so if you have enough control over the SQL Server install to enable that feature then it seems like a good fit for you.
FILESTREAM integrates the SQL Server Database Engine with an NTFS file system by storing varbinary(max) binary large object (BLOB) data as files on the file system. Transact-SQL statements can insert, update, query, search, and back up FILESTREAM data. Win32 file system interfaces provide streaming access to the data.
Files stored directly in the file system with FILESTREAM don't count towards the database size because they aren't stored in the DB.
To confirm with an official source: https://learn.microsoft.com/en-us/sql/relational-databases/blob/filestream-compatibility-with-other-sql-server-features
SQL Server Express supports FILESTREAM. The 10-GB database size limit does not include the FILESTREAM data container.
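As a rough sketch of what the C# side could look like, assuming FILESTREAM is already enabled and a hypothetical Documents table (with a ROWGUIDCOL Id and a VARBINARY(MAX) FILESTREAM column) already exists:

using System;
using System.Data.SqlClient;
using System.IO;

class FilestreamDemo
{
    // Inserts one file; SQL Server transparently writes the BLOB to the NTFS container.
    static void SaveFile(string connectionString, string path)
    {
        byte[] bytes = File.ReadAllBytes(path); // fine for docx/pdf/png-sized files

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO Documents (Id, Name, Content) VALUES (NEWID(), @name, @content)", conn))
        {
            cmd.Parameters.AddWithValue("@name", Path.GetFileName(path));
            cmd.Parameters.AddWithValue("@content", bytes);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}

For very large files you would stream with the Win32 API (SqlFileStream) instead of loading the whole file into a byte array.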
I'm trying to download a big file from FTP, store it somewhere in Azure Storage, and then run BULK INSERT on Azure SQL to save the data into a table.
I have an FTP server that I read data from as CSV files. Some of those files are very large, about 1.5 GB or even more. So far I have been downloading these files into memory and then saving them to the database using SqlBulkCopy in C# on Azure, but now I'm getting an OutOfMemoryException, which seems to be due to the size of the file.
That's why I'm thinking about using BULK INSERT directly from SQL on Azure, but then that SQL instance needs access to the storage the file is downloaded to, and of course that cannot be my local machine: it seems I cannot run a BULK INSERT command on SQL Server on Azure when the source file is located on my local storage.
Is there any way to download and save a file into Azure storage that SQL has access to, and then execute BULK INSERT?
You can use Azure Data Factory to copy the data from the FTP server to Azure SQL.
Data Factory offers good performance for transferring big data, and it supports FTP as a connector.
Please refer to these tutorials:
Copy data from FTP server by using Azure Data Factory
Copy data to or from Azure SQL Database by using Azure Data Factory
Copy data from Azure Blob storage to a SQL database by using the Copy Data tool
How can I download the file directly from FTP into that storage on Azure?
You can create a pipeline using FTP as the source and your storage account as a linked service.
You can also copy a big file from FTP to Azure SQL directly.
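For completeness, a rough code sketch of the download-to-storage-then-BULK INSERT route the question asks about. All names and credentials are placeholders, and it assumes an external data source named MyBlobStorage (pointing at the container, with a SAS credential) has already been created in the Azure SQL database:

using System;
using System.Data.SqlClient;
using System.Net;
using Azure.Storage.Blobs;

class FtpToAzureSql
{
    static void Main()
    {
        // 1. Open a download stream from the FTP server (nothing is buffered in memory).
        var request = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/data/big.csv");
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = new NetworkCredential("user", "password");

        using (var response = (FtpWebResponse)request.GetResponse())
        using (var ftpStream = response.GetResponseStream())
        {
            // 2. Pipe the stream straight into a blob; the SDK uploads it in blocks.
            var blob = new BlobClient("<storage-connection-string>", "imports", "big.csv");
            blob.Upload(ftpStream, overwrite: true);
        }

        // 3. Ask Azure SQL to bulk-load the blob it can now reach.
        using (var conn = new SqlConnection("<azure-sql-connection-string>"))
        using (var cmd = new SqlCommand(
            "BULK INSERT dbo.Target FROM 'big.csv' " +
            "WITH (DATA_SOURCE = 'MyBlobStorage', FORMAT = 'CSV', FIRSTROW = 2)", conn))
        {
            cmd.CommandTimeout = 0; // large loads can take a while
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}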
Hope this helps.
I don't think it's possible, but I would like to confirm. I am guessing you have to use the MediaCapture.StartRecordToStorageFileAsync method, which will record into the file system. Then you can transfer it from the file system into a database?
You can record to a non-file-based stream as well with MediaCapture.StartRecordToStreamAsync, and you may be able to wire that stream to your SQL database without saving to a file first.
How exactly you'd wire it up depends on what sort of database connection you are using and if and how it can consume a stream. Windows Runtime doesn't have a built-in SQL database. Typically one uses a local database such as SQLite or a remote database through a web service. For the latter you can use the recording stream as the source for an HttpStreamContent.
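For example, a rough UWP sketch; the service URL is hypothetical, and this variant records to memory first and uploads after stopping rather than streaming live:

using System;
using System.Threading.Tasks;
using Windows.Media.Capture;
using Windows.Media.MediaProperties;
using Windows.Storage.Streams;
using Windows.Web.Http;

static class RecordingUploader
{
    // Assumes 'capture' has already been initialized with InitializeAsync().
    public static async Task RecordAndUploadAsync(MediaCapture capture)
    {
        var stream = new InMemoryRandomAccessStream();
        await capture.StartRecordToStreamAsync(
            MediaEncodingProfile.CreateMp4(VideoEncodingQuality.Auto), stream);

        await Task.Delay(TimeSpan.FromSeconds(10)); // record for a while
        await capture.StopRecordAsync();

        stream.Seek(0); // rewind before reading
        using (var client = new HttpClient())
        {
            var content = new HttpStreamContent(stream); // recording stream as the request body
            await client.PostAsync(new Uri("https://example.com/api/recordings"), content);
        }
    }
}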
Our ASP.NET web app lets users import data from Excel. We're currently hosted on our own servers, so have access to the file system both from the web server and the SQL server. The process currently works like this:
Save uploaded file to temp folder on server
Execute T-SQL to read uploaded file directly from the file system to a SQL temp table via OLEDB
Execute T-SQL to read data from temp table and process as needed (e.g. compare to existing data and update/insert as appropriate)
Step 2 looks something like this:
SELECT * INTO #MY_TEMP_TABLE
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0; Database=PATH_TO_MY_UPLOADED_FILE; HDR=YES',
    'SELECT * FROM [MY_WORKSHEET_NAME$]')
This is very fast and straightforward compared to (for example) reading the file into a DataTable in .NET using EPPlus and then inserting the data row by row.
We're in the process of making the app Azure-ready (Azure Website and SQL Database, not VM). We can upload the file to blob storage and get its contents into a byte array, but then we're stuck with the row-by-row processing approach, which is slow and brittle.
Is there a fast way to programmatically bulk upload Excel to SQL in Azure?
I'd look at one of the commercial Excel components from the likes of ComponentOne. Read the spreadsheet's contents into memory and then write it into Azure SQL Database using standard ADO.NET techniques.
This will probably be more reliable and you can utilise retry logic for transient failures (http://www.asp.net/aspnet/overview/developing-apps-with-windows-azure/building-real-world-cloud-apps-with-windows-azure/transient-fault-handling).
Note that you need to be aware of Throttling behaviour in Azure SQL Database and how it might impact your app: http://msdn.microsoft.com/en-us/library/azure/dn338079.aspx
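For illustration, a minimal sketch of that approach with plain ADO.NET: the DataTable would come from whichever Excel component you choose, and SqlBulkCopy replaces the row-by-row inserts (the staging table name is a placeholder):

using System.Data;
using System.Data.SqlClient;

static class ExcelImporter
{
    static void Import(DataTable sheet, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.ImportStaging"; // hypothetical staging table
                bulk.BatchSize = 5000;           // commit in chunks to play nicely with throttling
                bulk.WriteToServer(sheet);       // streams all rows in a few round trips
            }
        }
    }
}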
I want to use an SQLite database that is on an FTP server, without downloading it. Is it possible to use this database directly?
No, the FTP protocol is designed to sequentially transfer the entire contents of files. There is no way to perform random reads/writes to a file, which is necessary for SQLite (or any database program) to work.
The connection strings for the data providers used by SQLite only support UNC paths; URL parameters are not supported. You must download the file locally.
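For example, a minimal sketch (using the System.Data.SQLite provider; host, credentials and paths are placeholders) of downloading the file first and then opening it locally:

using System.IO;
using System.Net;
using System.Data.SQLite;

static class RemoteDb
{
    static SQLiteConnection OpenRemoteDatabase()
    {
        string localPath = Path.Combine(Path.GetTempPath(), "app.db");

        // FTP can only hand us the whole file, so fetch it to local disk first.
        using (var client = new WebClient())
        {
            client.Credentials = new NetworkCredential("user", "password");
            client.DownloadFile("ftp://ftp.example.com/data/app.db", localPath);
        }

        // Now SQLite can do the random reads/writes it needs.
        var conn = new SQLiteConnection("Data Source=" + localPath + ";Version=3;");
        conn.Open();
        return conn;
    }
}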
I do not know how SQLite works internally, but if you have your database files on the server, you could mount the filesystem over FTP and run SQLite locally against the mounted files.
Given how the FTP protocol is designed, if your database is just one file, the system will download the whole file even if you just want the first row, every time the file is needed (unless a file cache is used). If your database is multiple files, each file will be downloaded when it is needed. As Pavel Krymets said, it will be slow, so it is not recommended.
I am creating an application for some users to maintain records in a database. For this, I'll have to write SQL queries (C#) and create the database, if it does not exist, when the user starts/installs the application. To make the creation and backup procedure easier, I want to create a separate file for SQL Server that will be used to store data. This file will be included in the installation package and copied to the destination folder to be used by SQL Server.
I've seen that we can create such a file, but I have never used one like this.
Is it possible to accomplish the job I am trying to do?
I think that if each application has its own DB, you should use either SQL Server CE or SQLite.
They are self-contained, serverless, zero-configuration, transactional SQL database engines, so you don't have to install SQL Server Express on every PC.
They use a subset of T-SQL, and you can do almost the same things as in SQL Server.
You could embed a file in the application with all the SQL commands to create the DB and then execute it the first time the application starts, as sketched below.
Anyway, you can distribute your app with the DB already created and ready to use: it is just an .sdf file.
If you plan to distribute your app with ClickOnce, then SQL Server CE is better, because ClickOnce recognizes its file format and handles it during application updates.
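A minimal sketch of that first-run pattern with SQL Server CE; the table and file names are made up, and the schema script could just as well come from an embedded resource:

using System.Data.SqlServerCe;
using System.IO;

static class DbBootstrap
{
    static void EnsureDatabase(string dbPath)
    {
        string connStr = "Data Source=" + dbPath;
        if (File.Exists(dbPath))
            return; // already created on a previous run

        using (var engine = new SqlCeEngine(connStr))
            engine.CreateDatabase(); // creates an empty .sdf

        using (var conn = new SqlCeConnection(connStr))
        using (var cmd = conn.CreateCommand())
        {
            conn.Open();
            // SQL CE runs one statement at a time; execute each CREATE TABLE separately.
            cmd.CommandText =
                "CREATE TABLE Records (Id INT IDENTITY PRIMARY KEY, Name NVARCHAR(100))";
            cmd.ExecuteNonQuery();
        }
    }
}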
If you want to use SQL Server, you can use the Compact Edition: http://msdn.microsoft.com/en-us/data/ff687142
SQL Server CE databases are stored in .sdf files (up to 4 GB) that can be shipped with your application. That way, if you want to connect to a full SQL Server database later, you can just change the connection strings in your application config.
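For illustration only, the swap is typically just a connection-string change, along these lines (names are placeholders):

// SQL Server Compact, .sdf file shipped with the app:
const string ceConnStr = @"Data Source=|DataDirectory|\MyApp.sdf";
// Full SQL Server later on, per the answer's point about changing config only:
const string sqlConnStr = @"Server=.\SQLEXPRESS;Database=MyApp;Integrated Security=True";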
You could make a backup and then restore it: http://www.dotnetspider.com/forum/162986-database-backup-restore-through-C.aspx