How to store and then execute a linq2db query? - C#

I'm very new to C#. In other languages I'm used to working with a database in a "prepare, then execute the statement with the needed parameters" style, so I could pass a prepared query to a function and so on.
Now I'm trying to solve this task: get almost identical result sets from different databases in different contexts. For example, you have a "files" table with file metadata; in one DB it has a "file_priority" field, in another it has a "file_description" field, and so on, but most fields are common and the processing logic is almost the same.
So I wanted to prepare the query in outer code and pass it, along with the needed parameter values, to a data-processing function (or class).
I can't find any example of executing a linq2db query this way (passing parameter values to an already prepared query).
==========UPD============
OK, here's an example. Let's talk about Perl.
package main1;
use FileProcessor;
my $sth = $dbh->prepare(qq[ SELECT name, extension, file_priority FROM files WHERE file_id = ? ]);
my $processor = new FileProcessor(query => $sth);
my $file_id = $ENV{file_id};
my $file_data = $processor->get_file(id => $file_id);

package main2;
use FileProcessor;
my $sth = $dbh->prepare(qq[ SELECT name, extension, file_description FROM files WHERE file_id = ? ]); # preparing query
my $processor = new FileProcessor(query => $sth);
my $file_id = $ENV{file_id};
my $file_data = $processor->get_file(id => $file_id);

package FileProcessor;
sub new {
    my $class = shift;
    my $self = {@_};
    bless $self, $class;
}
sub get_file {
    my $self = shift;
    my %params = @_;
    $self->{query}->execute($params{id}); # passing params to query
    my $file_data = $self->{query}->fetchrow_hashref; # fetching result from DB
    # do something with file data
    # ...
    return $file_data;
}

linq2db does not give you a separate preparation step, but you can run a raw query via a number of extension methods:
db.Query<File>("SELECT name, extension, file_priority FROM files WHERE file_id = @fileid",
    new DataParameter("@fileid", id));
db.Query<File>("SELECT name, extension, file_priority FROM files WHERE file_id = @fileid",
    new { fileid = id });
Doing it that way is not typical for linq2db, though, because it supports LINQ queries, and parameters are created automatically:
var id = 1;
var query =
    from f in db.GetTable<File>()
    where f.file_id == id
    select new File
    {
        name = f.name,
        extension = f.extension,
        file_priority = f.file_priority
    };
var file = query.FirstOrDefault();
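If you want to mirror the Perl FileProcessor pattern (build the query in outer code and hand it to a data-processing class), one option is to pass a delegate that builds the query for a given id. This is only a minimal sketch under the question's assumptions; the FileProcessor class here is hypothetical and reuses the db and File mapping from above, it is not part of linq2db:

// Minimal sketch: the processor receives a query factory instead of a prepared statement.
class FileProcessor
{
    private readonly Func<int, IQueryable<File>> _queryFactory;

    public FileProcessor(Func<int, IQueryable<File>> queryFactory)
    {
        _queryFactory = queryFactory;
    }

    public File GetFile(int id)
    {
        // The delegate builds a fresh IQueryable for this id; linq2db
        // translates it into parameterized SQL when it is enumerated.
        return _queryFactory(id).FirstOrDefault();
    }
}

// Usage: each database context supplies its own projection.
var processor = new FileProcessor(id =>
    from f in db.GetTable<File>()
    where f.file_id == id
    select new File { name = f.name, extension = f.extension, file_priority = f.file_priority });
var fileData = processor.GetFile(1);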

Related

Multiple table same fields LINQ

I have to take the same values from multiple sources, so I used Concat, but I have a large number of fields and a couple more sources too.
IEnumerable<Parts> partsList = (from parts in xml.XPathSelectElements("//APS/P")
    select new WindchillPart
    {
        Code = (string)parts.Element("Number"),
        Part = (string)parts.Element("KYZ"),
        Name = (string)parts.Element("Name"),
    })
    .Concat(from uparts in xml.XPathSelectElements("//APS/U")
    select new WindchillPart
    {
        Code = (string)uparts.Element("Number"),
        Part = (string)uparts.Element("KYZ"),
        Name = (string)uparts.Element("Name"),
    });
I have almost 15 fields and 5 sources. So is there any way to factor out the common fields and just add the sources somewhere, to simplify this?
You could create an array of all your paths and use SelectMany to get the elements. In the end, you call Select just once:
string[] paths = new string[] { "//APS/P", "//APS/U" };
IEnumerable<Parts> partsList = paths
    .SelectMany(path => xml.XPathSelectElements(path))
    .Select(uparts => new WindchillPart
    {
        Code = (string)uparts.Element("Number"),
        Part = (string)uparts.Element("KYZ"),
        Name = (string)uparts.Element("Name"),
    });
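If the 15 fields are what you want to write only once, you can also pull the projection out into a delegate and reuse it for every source. A small sketch under the same assumptions as the answer above (WindchillPart, the element names, and the assignability of WindchillPart to Parts all come from the question's code):

// Define the mapping once and reuse it for every source path.
Func<XElement, WindchillPart> toPart = e => new WindchillPart
{
    Code = (string)e.Element("Number"),
    Part = (string)e.Element("KYZ"),
    Name = (string)e.Element("Name"),
    // ...the remaining fields, written once
};

string[] paths = { "//APS/P", "//APS/U" /* plus the other sources */ };
IEnumerable<Parts> partsList = paths
    .SelectMany(path => xml.XPathSelectElements(path))
    .Select(toPart);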

Azure Data Factory dynamic Copy Activity with multiple tables in source and destination using C# .NET

I have written this copy activity in C# (.NET) that moves data from SQL Server to Azure. At present it uses one table for both source and destination. Here is my code:
private DatasetResource CreateDataSetResourceSqlServer(string tableName)
{
    DatasetResource sqlDatasetResource = new DatasetResource(
        new SqlServerTableDataset
        {
            LinkedServiceName = new LinkedServiceReference(LinkedServiceReferenceNameSqlServer),
            TableName = tableName
        }
    );
    return sqlDatasetResource;
}

private CopyActivity CreateCopyActivity()
{
    return new CopyActivity
    {
        Name = CopyActivityNameSqlServerToAzure,
        Inputs = new List<DatasetReference>() { new DatasetReference() { ReferenceName = DataSetNameSqlServer } },
        Outputs = new List<DatasetReference>() { new DatasetReference() { ReferenceName = DataSetNameAzureSql } },
        Source = new SqlSource { SqlReaderQuery = "SELECT * FROM Table1" },
        Sink = new SqlSink { }
    };
}
This works, but it is not a scalable solution: dataset creation is dynamic, but the CopyActivity isn't. I have 50 tables in source and destination. My idea is to list all my table names in a file and iterate through them, but how do I make a CopyActivity with a dynamic source that can copy data for multiple tables? Also, if new tables are added to the file in the future, I don't want to have to write a CopyActivity per table.
Could anyone help?
You could use the ForEach activity and the expression language. Please see the example here:
https://learn.microsoft.com/en-us/azure/data-factory/control-flow-for-each-activity#iteration-expression-language
Since you are using C#, you may or may not need the ForEach activity, as you will be reading the table names into a variable and looping through them. I would suggest creating a parameter in ADF for the table name and using it in the source and the sink; a sketch of the C# loop is below.
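Building on the question's own helpers, one way to drive this from C# is to loop over the table list and create one dataset pair and one copy activity per table. This is only a sketch: the naming scheme, the tables.txt file, and the CreateCopyActivityForTable helper are all hypothetical, not part of the ADF SDK:

// Hypothetical sketch: one copy activity per table name, reusing the
// pattern from the question instead of a hard-coded query.
private CopyActivity CreateCopyActivityForTable(string tableName)
{
    return new CopyActivity
    {
        Name = "CopySqlServerToAzure_" + tableName,
        Inputs = new List<DatasetReference>() { new DatasetReference() { ReferenceName = "SqlServerDataset_" + tableName } },
        Outputs = new List<DatasetReference>() { new DatasetReference() { ReferenceName = "AzureSqlDataset_" + tableName } },
        // The reader query is built from the table name instead of being hard-coded.
        Source = new SqlSource { SqlReaderQuery = "SELECT * FROM [" + tableName + "]" },
        Sink = new SqlSink { }
    };
}

// Usage: read the table list from a file, then create datasets and
// activities in a loop; new tables only require a new line in the file.
var tableNames = System.IO.File.ReadAllLines("tables.txt");
var activities = tableNames.Select(CreateCopyActivityForTable).ToList();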

Must declare the scalar variable - Dapper

I'm doing multi-mapping with Dapper, and then I try to use the Dapper SqlBuilder.
But it returns this exception:
Must declare the scalar variable "@ExecutionId". Invalid usage of the option NEXT in the FETCH statement.
Without multi-mapping, it never gave a problem.
Here is my code snippet:
var Builder = new SqlBuilder();
var SelectedQuery = Builder.AddTemplate(@"
    SELECT e.[Id], e.[BuyOrderBookId], e.[SellOrderBookId], e.[Volume], e.[Price], e.[CreationDate], e.[StatusId],
           bo.[UserId], bo.[MarketId], so.[UserId]
    FROM [dbo].[Execution] AS e
    JOIN [dbo].[OrderBook] AS bo ON e.BuyOrderBookId = bo.Id
    JOIN [dbo].[OrderBook] AS so ON e.SellOrderBookId = so.Id
    /**where**/
    ORDER BY e.[CreationDate] DESC OFFSET @skip ROWS FETCH NEXT @take ROWS ONLY;
");

// Execution ID
if (filter.ExecutionId.HasValue)
    Builder.Where("e.[Id] = @ExecutionId", new { ExecutionId = filter.ExecutionId.Value });

var query = await connection.QueryAsync<ExecutionViewModel, OrderBookViewModel, OrderBookViewModel, ExecutionViewModel>(
    SelectedQuery.RawSql,
    (execute, buyOrder, sellOrder) => {
        execute.BuyUserId = buyOrder.UserId;
        execute.SellUserId = sellOrder.UserId;
        execute.MarketId = buyOrder.MarketId;
        return execute;
    },
    splitOn: "UserId,UserId",
    param: new {
        SelectedQuery.Parameters,
        skip = (pagingParam.PageNumber - 1) * pagingParam.PageSize,
        take = pagingParam.PageSize
    });
Does anyone know what I did wrong here?
Update
I just fixed it like this:
if (filter.ExecutionId.HasValue)
    Builder.Where(String.Format("e.[Id] = {0}", filter.ExecutionId));
I believe this is not a good way to implement it; it risks SQL injection.
Try changing the where clause like this:
if (filter.ExecutionId.HasValue)
    Builder.Where("e.[Id]", new { Id = filter.ExecutionId.Value });
You can add parameters like this:
if (filter.ExecutionId.HasValue)
{
    Builder.Where("e.[Id] = @ExecutionId");
    ((DynamicParameters)SelectedQuery.Parameters)
        .AddDynamicParams(new {
            ExecutionId = filter.ExecutionId.Value
        });
}
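A likely reason the original snippet fails is that wrapping SelectedQuery.Parameters inside another anonymous object hides the template's parameters from Dapper. A sketch of one way around this, assuming the paging values can go into the same parameter bag (the cast mirrors the answer above):

// Sketch: add the paging values to the template's own parameter bag and pass
// that bag directly, so @ExecutionId, @skip and @take all reach the server.
((DynamicParameters)SelectedQuery.Parameters).AddDynamicParams(new
{
    skip = (pagingParam.PageNumber - 1) * pagingParam.PageSize,
    take = pagingParam.PageSize
});

var query = await connection.QueryAsync<ExecutionViewModel, OrderBookViewModel, OrderBookViewModel, ExecutionViewModel>(
    SelectedQuery.RawSql,
    (execute, buyOrder, sellOrder) =>
    {
        execute.BuyUserId = buyOrder.UserId;
        execute.SellUserId = sellOrder.UserId;
        execute.MarketId = buyOrder.MarketId;
        return execute;
    },
    splitOn: "UserId,UserId",
    param: SelectedQuery.Parameters);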

Passing parameter to CosmosDB stored procedure

I have a Cosmos DB stored procedure to which I am passing a list of comma-separated IDs. I need to pass those IDs into the IN query. When I pass one value as the parameter it works fine, but not with more than one value.
It would be great if anyone could help here.
Below is the code of the stored procedure:
function getData(ids) {
    var context = getContext();
    var coll = context.getCollection();
    var link = coll.getSelfLink();
    var response = context.getResponse();
    var query = {
        query: "SELECT * FROM c where c.vin IN (@ids)",
        parameters: [{ name: "@ids", value: ids }]
    };
    var requestOptions = { pageSize: 500 };
    var run = coll.queryDocuments(link, query, requestOptions, callback);
    function callback(err, docs) {
        if (err) throw err;
        if (!docs || !docs.length) response.setBody(null);
        else {
            response.setBody(JSON.stringify(docs));
        }
    }
    if (!run) throw new Error('Unable to retrieve the requested information.');
}
For arrays, you should use the ARRAY_CONTAINS function:
var query = {
    query: "SELECT * FROM c where ARRAY_CONTAINS(@ids, c.vin)",
    parameters: [{ name: "@ids", value: ids }]
};
Also, it is possible that, as stated in this doc, your @ids array is being sent as a string:
When defining a stored procedure in Azure portal, input parameters are always sent as a string to the stored procedure. Even if you pass an array of strings as an input, the array is converted to string and sent to the stored procedure. To work around this, you can define a function within your stored procedure to parse the string as an array
So you might need to parse it before querying:
function getData(ids) {
    var arr = JSON.parse(ids);
    // ...
}
Related:
How can I pass array as a sql query param for cosmos DB query
https://github.com/Azure/azure-cosmosdb-node/issues/156
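On the client side the same ARRAY_CONTAINS approach works with a real array parameter. Here is a sketch with the Microsoft.Azure.Cosmos (v3) .NET SDK; the container variable and the vin property are assumptions carried over from the question:

// Sketch: query with an array parameter from C# (Microsoft.Azure.Cosmos v3).
var ids = new[] { "abc", "def", "ghi" };
var queryDef = new QueryDefinition("SELECT * FROM c WHERE ARRAY_CONTAINS(@ids, c.vin)")
    .WithParameter("@ids", ids);

using FeedIterator<dynamic> feed = container.GetItemQueryIterator<dynamic>(queryDef);
while (feed.HasMoreResults)
{
    foreach (var doc in await feed.ReadNextAsync())
    {
        Console.WriteLine(doc);
    }
}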
This is how you can do it:
Inside the stored procedure:
1. Parse your single parameter into an array using the split function.
2. Loop through the array and
   a) build the parameter name/value pairs and push them into the parameter array used by the query later;
   b) use the parameter names to build a string for use inside the parentheses of the IN statement.
3. Build the query definition and pass it to the collection.
Example
This is how the value of the parameter looks: "abc,def,ghi,jkl"
If you are going to use this, replace "stringProperty" with the name of the property you are querying against.
// SAMPLE STORED PROCEDURE
function spArrayTest(arrayParameter) {
    var collection = getContext().getCollection();
    var stringArray = arrayParameter.split(",");
    var qParams = [];
    var qIn = "";
    for (var i = 0; i < stringArray.length; i++) {
        var nm = '@p' + i; // parameter name
        qParams.push({ name: nm, value: stringArray[i] });
        qIn += (nm + ','); // parameter name for query
    }
    qIn = qIn.substring(0, qIn.length - 1); // remove last comma
    // qIn only contains a list of the names in qParams
    var qDef = 'SELECT * from documents d where d.stringProperty in ( ' + qIn + ' )';
    console.log(qParams[0].name);
    // Query definition to be passed into the "queryDocuments" function
    var q = {
        query: qDef,
        parameters: qParams
    };
    // Query documents
    var isAccepted = collection.queryDocuments(
        collection.getSelfLink(),
        q,
        function (err, feed, options) {
            if (err) throw err;
            // Check the feed and if empty, set the body to 'no docs found',
            // else return all documents from feed
            if (!feed || !feed.length) {
                var response = getContext().getResponse();
                response.setBody('no docs found with a stringProperty in ' + arrayParameter);
            }
            else {
                var response = getContext().getResponse();
                response.setBody(feed);
            }
        });
    if (!isAccepted) throw new Error('The query was not accepted by the server.');
}
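For completeness, invoking such a stored procedure from C# might look like the sketch below (Microsoft.Azure.Cosmos v3 SDK; the database, container, and partition key values are hypothetical):

// Hypothetical sketch: invoke the stored procedure above from the C# side,
// passing the comma-separated ids as its single string parameter.
var client = new CosmosClient(connectionString);
var container = client.GetContainer("myDatabase", "myContainer");

var result = await container.Scripts.ExecuteStoredProcedureAsync<dynamic>(
    "spArrayTest",
    new PartitionKey("myPartitionKeyValue"),
    new dynamic[] { "abc,def,ghi,jkl" });

Console.WriteLine(result.Resource);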
Please refer to my sample JS code; it works for me.
function sample(ids) {
    var collection = getContext().getCollection();
    var query = 'SELECT * FROM c where c.id IN (' + ids + ')';
    console.log(query);
    var isAccepted = collection.queryDocuments(
        collection.getSelfLink(),
        query,
        function (err, feed, options) {
            if (err) throw err;
            if (!feed || !feed.length) getContext().getResponse().setBody('no docs found');
            else {
                for (var i = 0; i < feed.length; i++) {
                    var doc = feed[i];
                    doc.name = 'a';
                    collection.replaceDocument(doc._self, doc, function (err) {
                        if (err) throw err;
                    });
                }
                getContext().getResponse().setBody(JSON.stringify("success"));
            }
        });
    if (!isAccepted) throw new Error('The query was not accepted by the server.');
}
Parameter: '1','2','3'
Hope it helps you. (Note, though, that this version concatenates the IDs directly into the query text, so it carries the same SQL-injection risk mentioned earlier; prefer one of the parameterized approaches where possible.)

Speeding up a linq query with 40,000 rows

In my service, first I generate 40,000 possible combinations of home and host countries, like so (clientLocations contains 200 records, so 200 x 200 is 40,000):
foreach (var homeLocation in clientLocations)
{
    foreach (var hostLocation in clientLocations)
    {
        allLocationCombinations.Add(new AirShipmentRate
        {
            HomeCountryId = homeLocation.CountryId,
            HomeCountry = homeLocation.CountryName,
            HostCountryId = hostLocation.CountryId,
            HostCountry = hostLocation.CountryName,
            HomeLocationId = homeLocation.LocationId,
            HomeLocation = homeLocation.LocationName,
            HostLocationId = hostLocation.LocationId,
            HostLocation = hostLocation.LocationName,
        });
    }
}
Then I run the following query to find existing rates for the locations above, while also including empty records for the missing rates, resulting in a complete record set of 40,000 rows.
var allLocationRates = (from l in allLocationCombinations
                        join r in Db.PaymentRates_AirShipment
                            on new { home = l.HomeLocationId, host = l.HostLocationId }
                            equals new { home = r.HomeLocationId, host = (Guid?)r.HostLocationId }
                            into matches
                        from rate in matches.DefaultIfEmpty(new PaymentRates_AirShipment
                        {
                            Id = Guid.NewGuid()
                        })
                        select new AirShipmentRate
                        {
                            Id = rate.Id,
                            HomeCountry = l.HomeCountry,
                            HomeCountryId = l.HomeCountryId,
                            HomeLocation = l.HomeLocation,
                            HomeLocationId = l.HomeLocationId,
                            HostCountry = l.HostCountry,
                            HostCountryId = l.HostCountryId,
                            HostLocation = l.HostLocation,
                            HostLocationId = l.HostLocationId,
                            AssigneeAirShipmentPlusInsurance = rate.AssigneeAirShipmentPlusInsurance,
                            DependentAirShipmentPlusInsurance = rate.DependentAirShipmentPlusInsurance,
                            SmallContainerPlusInsurance = rate.SmallContainerPlusInsurance,
                            LargeContainerPlusInsurance = rate.LargeContainerPlusInsurance,
                            CurrencyId = rate.RateCurrencyId
                        });
I have tried using .AsEnumerable() and .AsNoTracking() and that has sped things up quite a bit. The following code shaves several seconds off of my query:
var allLocationRates = (from l in allLocationCombinations.AsEnumerable()
join r in Db.PaymentRates_AirShipment.AsNoTracking()
But, I am wondering: How can I speed this up even more?
Edit: I can't replicate the foreach functionality in LINQ.
allLocationCombinations = (from homeLocation in clientLocations
                           from hostLocation in clientLocations
                           select new AirShipmentRate
                           {
                               HomeCountryId = homeLocation.CountryId,
                               HomeCountry = homeLocation.CountryName,
                               HostCountryId = hostLocation.CountryId,
                               HostCountry = hostLocation.CountryName,
                               HomeLocationId = homeLocation.LocationId,
                               HomeLocation = homeLocation.LocationName,
                               HostLocationId = hostLocation.LocationId,
                               HostLocation = hostLocation.LocationName
                           });
I get an error on "from hostLocation in clientLocations" which says "cannot convert type IEnumerable to Generic.List."
The fastest way to query a database is to use the power of the database engine itself.
While Linq is a fantastic technology to use, it still generates a select statement out of the Linq query, and runs this query against the database.
Your best bet is to create a database View, or a stored procedure.
Views and stored procedures can easily be integrated into Linq.
Materialized views (indexed views in MS SQL) can further speed up execution, and adding missing indexes is by far the most effective way to speed up database queries.
How can I speed this up even more?
Optimizing is a bitch.
Your code looks fine to me. Make sure to set indexes on your DB schema where appropriate. And, as already mentioned, inspect the SQL your LINQ generates to get a better idea of the performance.
Well, but how do you improve performance anyway?
You may want to have a glance at the following link:
10 tips to improve LINQ to SQL Performance
To me, probably the most important points listed in the link above are:
Retrieve only the number of records you need.
Turn off the ObjectTrackingEnabled property of the data context if it's not necessary.
Filter data down to what you need using DataLoadOptions.AssociateWith.
Use compiled queries where needed; please be careful with that one (see the sketch below).
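To illustrate the last point, here is a minimal compiled-query sketch in the LINQ to SQL style (MyDataContext is hypothetical, and reusing the question's PaymentRates_AirShipment table is an assumption; linq2db and EF have their own equivalents). The expression tree is translated to SQL once and the resulting delegate is reused:

// Sketch: compile the query once, then invoke the delegate like a function.
static readonly Func<MyDataContext, Guid, IQueryable<PaymentRates_AirShipment>> RatesByHomeLocation =
    CompiledQuery.Compile((MyDataContext db, Guid homeLocationId) =>
        from r in db.PaymentRates_AirShipment
        where r.HomeLocationId == homeLocationId
        select r);

// Usage: the translation cost is paid once, not on every execution.
var rates = RatesByHomeLocation(Db, someHomeLocationId).ToList();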
