Lucene.NET and searching on multiple fields with specific values - c#

I've created an index with various bits of data for each document I've added; each document can differ in its field names.
Later on, when I come to search the index, I need to query it with exact field/value pairs - for example:
FieldName1 = X AND FieldName2 = Y AND FieldName3 = Z
What's the best way of constructing the following using Lucene.NET:
What analyser is best to use for this exact match type?
Upon retrieving a match, I only need one specific field to be returned (which I add to each document) - should this be the only one stored?
Later on I'll need to support keyword searching (so a field can have a list of values and I'll need to do a partial match).
The fields and values come from a Dictionary<string, string>. It's not user input, it's constructed from code.
Thanks,
Kieron

Well, I figured it out eventually - here's my take on it (this could be completely wrong, but it works for me):
public Guid? Find (Dictionary<string, string> searchTerms)
{
    if (searchTerms == null)
        throw new ArgumentNullException ("searchTerms");

    try
    {
        var directory = FSDirectory.Open (new DirectoryInfo (IndexRoot));
        if (!IndexReader.IndexExists (directory))
            return null;

        var mainQuery = new BooleanQuery ();
        foreach (var pair in searchTerms)
        {
            var parser = new QueryParser (
                Lucene.Net.Util.Version.LUCENE_CURRENT, pair.Key, GetAnalyzer ());
            var query = parser.Parse (pair.Value);
            mainQuery.Add (query, BooleanClause.Occur.MUST);
        }

        var searcher = new IndexSearcher (directory, true);
        try
        {
            var results = searcher.Search (mainQuery, (Filter)null, 10);
            if (results.totalHits != 1)
                return null;
            return Guid.Parse (searcher.Doc (results.scoreDocs[0].doc).Get (ContentIdKey));
        }
        catch
        {
            throw;
        }
        finally
        {
            if (searcher != null)
                searcher.Close ();
        }
    }
    catch
    {
        throw;
    }
}
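On the analyzer question: for exact field/value matching, one option worth noting (a minimal sketch, based on the same 2.9-era API as the code above) is to index the fields as Field.Index.NOT_ANALYZED and skip the QueryParser at search time, building TermQuery objects instead, so values are matched verbatim with no tokenizing or lowercasing. KeywordAnalyzer behaves the same way if you would rather keep an analyzer in the picture.
// Sketch: exact matching without analysis at query time.
// Assumes the fields were indexed with Field.Index.NOT_ANALYZED so the
// indexed terms are the raw values.
public Query BuildExactMatchQuery (Dictionary<string, string> searchTerms)
{
    var query = new BooleanQuery ();
    foreach (var pair in searchTerms)
    {
        // TermQuery matches the stored term verbatim.
        query.Add (new TermQuery (new Term (pair.Key, pair.Value)), BooleanClause.Occur.MUST);
    }
    return query;
}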

Related

How to convert SQLQuery to SortedList through EF6

I have an Entity Framework 6 class called Materials, which is reflected in my database as a table with the same name. Using a parent parameter, I need to return a sorted list of materials from a SQL Query, so that I can later check that edits the user makes do not affect the order. My SQL is a stored procedure that looks like this:
CREATE PROC [dbo].[GET_SortedMaterials] (@FinishedGoodCode VARCHAR(50))
AS
SELECT
    ROW_NUMBER() OVER (ORDER BY Component.Percentage_of_Parent DESC, Material.Material) AS _sortField
    ,Material.*
FROM
    Components AS Component
    INNER JOIN Materials AS Material ON Component.Child_Material = Material.Material
WHERE
    Component.Parent_Code = @FinishedGoodCode
ORDER BY
    Component.Percentage_of_Parent DESC
    ,Material.Material
As you can see, the ORDER BY field is not included in the Materials columns. For this reason, I felt I could not return just a set of Material objects and still keep the sorting - I have performed the ordering in SQL and added the _sortField (I think that field may be a bad idea).
My C# code to read the SQL looks like this:
public async Task<SortedList<int, Materials>> GET_SortedMaterials(IProgress<Report> progress, string finishedGoodCode)
{
    try
    {
        var report = new Report { Message = "Retrieving Sorted Materials", NewLine = true, StatusCode = Enums.StatusCode.Working };
        progress.Report(report);
        using (var context = new DBContext())
        {
            var ingredientList = await context.Database.SqlQuery<(int _sortField, Materials mat)>("[app].[GET_Customers]").ToListAsync();
            var sorted = new SortedList<int, Raw_Materials>();
            foreach (var (_sortField, mat) in ingredientList.OrderBy(x => x._sortField))
            {
                sorted.Add(_sortField, mat);
            }
            return sorted;
        }
    }
    catch (Exception ex)
    {
        [EXCLUDED CODE]
    }
}
When the code executes, I get the correct number of rows returned, but I do not get a SortedList where the key corresponds to the _sortField value and the value to the Material. I have tried various versions of basically the same code and I cannot get it to return a list of materials with information about their sorting; instead, the conversion to the EF class fails entirely and I only get null values back.
Any advice on how to return a sorted list from SQL and maintain the sorting in C#, when the sort field is not in the return values, would be very gratefully received.
use
var ingredientList = context.Database.SqlQuery<Materials>("[app].[GET_Customers]").AsEnumerable().Select((mat, _sortField) => (_sortField, mat)).ToDictionary(x => x._sortField, x => x.mat);
or, if you want an async load, use
var ingredientList = (await context.Database.SqlQuery<Materials>("[app].[GET_Customers]").ToListAsync()).Select((mat, _sortField) => (_sortField, mat)).ToDictionary(x => x._sortField, x => x.mat);
full code
public async Task<SortedList<int, Materials>> GET_SortedMaterials(IProgress<Report> progress, string finishedGoodCode)
{
    try
    {
        var report = new Report { Message = "Retrieving Sorted Materials", NewLine = true, StatusCode = Enums.StatusCode.Working };
        progress.Report(report);
        using (var context = new DBContext())
        {
            var ingredientList = (await context.Database.SqlQuery<Materials>("[app].[GET_Customers]").ToListAsync())
                .Select((mat, _sortField) => (_sortField, mat))
                .ToDictionary(x => x._sortField, x => x.mat);
            var sorted = new SortedList<int, Materials>();
            foreach (var item in ingredientList.OrderBy(x => x.Key))
            {
                sorted.Add(item.Key, item.Value);
            }
            return sorted;
        }
    }
    catch (Exception ex)
    {
        [EXCLUDED CODE]
    }
}
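Another option, sketched under the assumption that the stored procedure's result set can be mapped onto a flat class: SqlQuery<T> cannot materialize tuples, but it can materialize a plain DTO that carries _sortField alongside the Materials columns, which keeps the SQL ordering without touching the Materials entity. The DTO name and property list below are hypothetical; only _sortField and the procedure from the question are taken as given.
// Hypothetical DTO matching the stored procedure's result set
// (requires using System.Data.SqlClient for SqlParameter).
public class SortedMaterialRow
{
    public int _sortField { get; set; }
    public string Material { get; set; }
    // ... the other Materials columns you need
}

var rows = await context.Database
    .SqlQuery<SortedMaterialRow>(
        "[dbo].[GET_SortedMaterials] @FinishedGoodCode",
        new SqlParameter("@FinishedGoodCode", finishedGoodCode))
    .ToListAsync();

var sorted = new SortedList<int, SortedMaterialRow>();
foreach (var row in rows)
    sorted.Add(row._sortField, row);
If the caller really needs Materials entities, the DTO values can be copied or mapped back onto Materials afterwards.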

How to replace placeholders within email template dynamically

Is there a good way to replace placeholders with dynamic data?
I have tried loading a template and then replacing all {{PLACEHOLDER}} tags with data from the meta object, which is working.
But if I need to add more placeholders I have to do it in code and make a new deployment, so if possible I want to drive it from the database, like this:
Table Placeholders
ID, Key (nvarchar(50)), Value (nvarchar(59))
1 {{RECEIVER_NAME}} meta.receiver
2 {{RESOURCE_NAME}} meta.resource
3 ..
4 .. and so on
the meta is the name of the parameter sent into the BuildTemplate method.
So when I loop through all the placeholders (from the db), I want to resolve the value from the db against the meta object.
Instead of getting the literal string "meta.receiver", I need the value inside the parameter.
GetAllAsync ex.1
public async Task<Dictionary<string, object>> GetAllAsync()
{
return await _context.EmailTemplatePlaceholders.ToDictionaryAsync(x => x.PlaceholderKey, x => x.PlaceholderValue as object);
}
GetAllAsync ex.2
public async Task<IEnumerable<EmailTemplatePlaceholder>> GetAllAsync()
{
var result = await _context.EmailTemplatePlaceholders.ToListAsync();
return result;
}
sample not using db (working)
private async Task<string> BuildTemplate(string template, dynamic meta)
{
var sb = new StringBuilder(template);
sb.Replace("{{RECEIVER_NAME}}", meta.receiver?.ToString());
sb.Replace("{{RESOURCE_NAME}}", meta.resource?.ToString());
return sb.ToString();
}
how I want it to work
private async Task<string> BuildTemplate(string template, dynamic meta)
{
var sb = new StringBuilder(template);
var placeholders = await _placeholders.GetAllAsync();
foreach (var placeholder in placeholders)
{
// when using reflection I still get a string like "meta.receiver" instead of meta.receiver, like the object.
// in other words, the sb.Replace methods gives the same result.
//sb.Replace(placeholder.Key, placeholder.Value.GetType().GetField(placeholder.Value).GetValue(placeholder.Value));
sb.Replace(placeholder.Key, placeholder.Value);
}
return sb.ToString();
}
I think there might be a better solution for this problem. Please let me know!
We have solved a similar issue in our development.
We created an extension method to format any object.
Please review our source code:
public static string FormatWith(this string format, object source, bool escape = false)
{
    return FormatWith(format, null, source, escape);
}

public static string FormatWith(this string format, IFormatProvider provider, object source, bool escape = false)
{
    if (format == null)
        throw new ArgumentNullException("format");

    List<object> values = new List<object>();
    var rewrittenFormat = Regex.Replace(format,
        @"(?<start>\{)+(?<property>[\w\.\[\]]+)(?<format>:[^}]+)?(?<end>\})+",
        delegate(Match m)
        {
            var startGroup = m.Groups["start"];
            var propertyGroup = m.Groups["property"];
            var formatGroup = m.Groups["format"];
            var endGroup = m.Groups["end"];

            var value = propertyGroup.Value == "0"
                ? source
                : Eval(source, propertyGroup.Value);
            if (escape && value != null)
            {
                value = XmlEscape(JsonEscape(value.ToString()));
            }
            values.Add(value);

            var openings = startGroup.Captures.Count;
            var closings = endGroup.Captures.Count;
            return openings > closings || openings % 2 == 0
                ? m.Value
                : new string('{', openings) + (values.Count - 1) + formatGroup.Value
                  + new string('}', closings);
        },
        RegexOptions.Compiled | RegexOptions.CultureInvariant | RegexOptions.IgnoreCase);

    return string.Format(provider, rewrittenFormat, values.ToArray());
}

private static object Eval(object source, string expression)
{
    try
    {
        return DataBinder.Eval(source, expression);
    }
    catch (HttpException e)
    {
        throw new FormatException(null, e);
    }
}
The usage is very simple:
var body = "[{Name}] {Description} (<a href='{Link}'>See More</a>)";
var model = new { Name="name", Link="localhost", Description="" };
var result = body.FormatWith(model);
You want to do it like this:
sb.Replace(placeholder.Key, meta.GetType().GetField(placeholder.Value).GetValue(meta).ToString())
and instead of meta.receiver, your database would just store receiver
This way, the placeholder as specified in your database is replaced with the corresponding value from the meta object. The downside is you can only pull values from the meta object with this method. However, from what I can see, it doesn't seem like that would be an issue for you, so it might not matter.
More clarification: the issue with what you tried
//sb.Replace(placeholder.Key, placeholder.Value.GetType().GetField(placeholder.Value).GetValue(placeholder.Value));
is that, first of all, you get the type of the whole string "meta.receiver" instead of just the meta portion, and additionally there doesn't seem to be a conversion from a string to a class type (e.g. Type.GetType("meta")). Also, when you call GetValue, there's no conversion from the string to the object you need (I'm not positive what that would look like).
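A minimal sketch of that idea, assuming meta exposes public properties (anonymous and POCO objects do; swap GetProperty for GetField if they really are fields) and that the database stores just the member name, e.g. receiver:
private async Task<string> BuildTemplate(string template, dynamic meta)
{
    var sb = new StringBuilder(template);
    var placeholders = await _placeholders.GetAllAsync();
    object target = meta; // use plain reflection rather than dynamic dispatch
    foreach (var placeholder in placeholders)
    {
        // placeholder.Value holds a member name such as "receiver"
        var memberName = placeholder.Value?.ToString();
        var property = memberName == null ? null : target.GetType().GetProperty(memberName);
        var value = property?.GetValue(target)?.ToString() ?? string.Empty;
        sb.Replace(placeholder.Key, value);
    }
    return sb.ToString();
}
This assumes the Dictionary<string, object> flavour of GetAllAsync shown above; with the EmailTemplatePlaceholder flavour, use PlaceholderKey and PlaceholderValue instead.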
Since you want to replace all the placeholders in your template dynamically, without replacing them one by one manually, I think Regex is better suited for this.
This function takes the template you want to interpolate and the object you want to bind to it. It will automatically replace placeholders like {{RECEIVER_NAME}} with the values from your object.
You will need a class which contains all the properties that you want to bind. In this example my class is MainInvoiceBind.
public static string Format(string obj, MainInvoiceBind invoice)
{
    try
    {
        return Regex.Replace(obj, @"{{(?<exp>[^}]+)}}", match =>
        {
            try
            {
                var p = Expression.Parameter(typeof(MainInvoiceBind), "");
                var e = System.Linq.Dynamic.DynamicExpression.ParseLambda(new[] { p }, null, match.Groups["exp"].Value);
                return (e.Compile().DynamicInvoke(invoice) ?? "").ToString();
            }
            catch
            {
                return "Nill";
            }
        });
    }
    catch
    {
        return string.Empty;
    }
}
I implemented this technique in a project where I had to generate emails dynamically from their specified templates. It's working well for me. Hopefully it solves your problem.
I updated habib's solution to the more current System.Linq.Dynamic.Core NuGet package, with small improvements.
This function will automatically replace your placeholders like {{RECEIVER_NAME}} with data from your object. You can even use some operators, since it's using Linq.
public static string Placeholder(string input, object obj)
{
    try
    {
        var p = new[] { Expression.Parameter(obj.GetType(), "") };
        return Regex.Replace(input, @"{{(?<exp>[^}]+)}}", match =>
        {
            try
            {
                return DynamicExpressionParser.ParseLambda(p, null, match.Groups["exp"].Value)
                    .Compile().DynamicInvoke(obj)?.ToString();
            }
            catch
            {
                return "(undefined)";
            }
        });
    }
    catch
    {
        return "(error)";
    }
}
You could also make multiple objects accessible and name them.
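To make multiple objects accessible by name, a sketch along the same lines (still System.Linq.Dynamic.Core; the dictionary-based signature and the parameter names are my own, not from the answers above):
public static string Placeholder(string input, IDictionary<string, object> namedObjects)
{
    // One named lambda parameter per object; the template refers to them by name,
    // e.g. {{meta.Receiver}} or {{invoice.Total}}.
    var pairs = namedObjects.ToArray();
    var parameters = pairs.Select(kv => Expression.Parameter(kv.Value.GetType(), kv.Key)).ToArray();
    var arguments = pairs.Select(kv => kv.Value).ToArray();

    return Regex.Replace(input, @"{{(?<exp>[^}]+)}}", match =>
    {
        try
        {
            return DynamicExpressionParser.ParseLambda(parameters, null, match.Groups["exp"].Value)
                .Compile().DynamicInvoke(arguments)?.ToString() ?? "";
        }
        catch
        {
            return "(undefined)";
        }
    });
}
Called as, say, Placeholder(template, new Dictionary<string, object> { ["meta"] = meta, ["invoice"] = invoice }).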

Linq query for building a dictionary from a reg file

I'm building a simple dictionary from a reg file (export from Windows Regedit). The .reg file contains a key in square brackets, followed by zero or more lines of text, followed by a blank line. This code will create the dictionary that I need:
var a = File.ReadLines("test.reg");
var dict = new Dictionary<String, List<String>>();
foreach (var key in a)
{
    if (key.StartsWith("[HKEY"))
    {
        var iter = a.GetEnumerator();
        var value = new List<String>();
        do
        {
            iter.MoveNext();
            value.Add(iter.Current);
        } while (String.IsNullOrWhiteSpace(iter.Current) == false);
        dict.Add(key, value);
    }
}
I feel like there is a cleaner (prettier?) way to do this in a single Linq statement (using a group by), but it's unclear to me how to implement the iteration of the value items into a list. I suspect I could do the same GetEnumerator in a let statement but it seems like there should be a way to implement this without resorting to an explicit iterator.
Sample data:
[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\.msu]
#="Microsoft.System.Update.1"
[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\.MTS]
#="WMP11.AssocFile.M2TS"
"Content Type"="video/vnd.dlna.mpeg-tts"
"PerceivedType"="video"
[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\.MTS\OpenWithProgIds]
"WMP11.AssocFile.M2TS"=hex(0):
[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\.MTS\ShellEx]
[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\.MTS\ShellEx\{BB2E617C-0920-11D1-9A0B-00C04FC2D6C1}]
#="{9DBD2C50-62AD-11D0-B806-00C04FD706EC}"
Update
I'm sorry, I need to be more specific. The files I am looking at are around ~300 MB, so I took the approach I did to keep the memory footprint down. I'd prefer an approach that doesn't require pulling the entire file into memory.
You can always use Regex:
var dict = new Dictionary<String, List<String>>();
var a = File.ReadAllText(@"test.reg");
var results = Regex.Matches(a, "(\\[[^\\]]+\\])([^\\[]+)\r\n\r\n", RegexOptions.Singleline);
foreach (Match item in results)
{
    dict.Add(
        item.Groups[1].Value,
        item.Groups[2].Value.Split(new[] { "\r\n" }, StringSplitOptions.RemoveEmptyEntries).ToList()
    );
}
I whipped this out real quick. You might be able to improve the regex pattern.
Instead of using GetEnumerator, you can take advantage of the TakeWhile and Skip methods to break your list into smaller lists (each sublist represents one key and its values):
var registryLines = File.ReadLines("test.reg");
Dictionary<string, List<string>> resultKeys = new Dictionary<string, List<string>>();
while (registryLines.Count() > 0)
{
    // Take the key and values into a single list
    var keyValues = registryLines.TakeWhile(x => !String.IsNullOrWhiteSpace(x)).ToList();

    // Adds a new entry to the dictionary using the first value as key and the rest of the list as value
    if (keyValues != null && keyValues.Count > 0)
        resultKeys.Add(keyValues[0], keyValues.Skip(1).ToList());

    // Jumps to the next registry key (+1 to skip the blank line)
    registryLines = registryLines.Skip(keyValues.Count + 1);
}
EDIT based on your update
Update
I'm sorry, I need to be more specific. The files I am looking at are around ~300 MB, so I took the approach I did to keep the memory footprint down. I'd prefer an approach that doesn't require pulling the entire file into memory.
Well, if you can't read the whole file into memory, it makes no sense to me asking for a LINQ solution. Here is a sample of how you can do it reading line by line (still no need for GetEnumerator)
Dictionary<string, List<string>> resultKeys = new Dictionary<string, List<string>>();
using (StreamReader reader = File.OpenText("test.reg"))
{
    List<string> keyAndValues = new List<string>();
    while (!reader.EndOfStream)
    {
        string line = reader.ReadLine();

        // Adds key and values to a list until it finds a blank line
        if (!string.IsNullOrWhiteSpace(line))
            keyAndValues.Add(line);
        else
        {
            // Adds a new entry to the dictionary using the first value as key and the rest of the list as value
            if (keyAndValues != null && keyAndValues.Count > 0)
                resultKeys.Add(keyAndValues[0], keyAndValues.Skip(1).ToList());

            // Starts a new key collection
            keyAndValues = new List<string>();
        }
    }

    // Don't lose the final block if the file doesn't end with a blank line
    if (keyAndValues.Count > 0)
        resultKeys.Add(keyAndValues[0], keyAndValues.Skip(1).ToList());
}
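If the LINQ shape still matters, one way to reconcile it with the streaming constraint is a small iterator that yields one blank-line-terminated block at a time, so only one block plus the dictionary being built is in memory. A sketch, not tested against files of that size:
// Lazily yields one [key + following lines] block per group.
static IEnumerable<List<string>> ReadBlocks(string path)
{
    var block = new List<string>();
    foreach (var line in File.ReadLines(path)) // File.ReadLines streams the file
    {
        if (!string.IsNullOrWhiteSpace(line))
        {
            block.Add(line);
        }
        else if (block.Count > 0)
        {
            yield return block;
            block = new List<string>();
        }
    }
    if (block.Count > 0)
        yield return block; // last block when the file has no trailing blank line
}

// Usage: composes with LINQ while still reading line by line.
var dict = ReadBlocks("test.reg")
    .Where(b => b[0].StartsWith("[HKEY"))
    .ToDictionary(b => b[0], b => b.Skip(1).ToList());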
I think you can use code like this, if you can afford the memory:
var lines = File.ReadAllText(fileName);
var result =
    Regex.Matches(lines, @"\[(?<key>HKEY[^]]+)\]\s+(?<value>[^[]+)")
        .OfType<Match>()
        .ToDictionary(k => k.Groups["key"].Value, v => v.Groups["value"].Value.Trim('\n', '\r', ' '));
C# Demo
This will take 24.173 seconds for a file with more than 4 million lines - Size:~550MB - by using 1.2 GB memory.
Edit:
The best way is to use File.ReadLines, as it is lazy:
var lines = File.ReadLines(fileName);
var keyRegex = new Regex(@"\[(?<key>HKEY[^]]+)\]");
var currentKey = string.Empty;
var currentValue = string.Empty;
var result = new Dictionary<string, string>();
foreach (var line in lines)
{
    var match = keyRegex.Match(line);
    if (match.Length > 0)
    {
        if (!string.IsNullOrEmpty(currentKey))
        {
            result.Add(currentKey, currentValue);
            currentValue = string.Empty;
        }
        currentKey = match.Groups["key"].ToString();
    }
    else
    {
        currentValue += line;
    }
}
// Add the final key once the loop ends, otherwise the last entry is lost
if (!string.IsNullOrEmpty(currentKey))
    result.Add(currentKey, currentValue);
This will take 17093 milliseconds for a file with 795180 lines.

How to batch retrieve entities?

In Azure table storage, how can I query for a set of entities that match specific row keys in a partition?
I'm using Azure table storage and need to retrieve a set of entities that match a set of row keys within the partition.
Basically if this were SQL it may look something like this:
SELECT TOP 1 SomeKey
FROM TableName WHERE SomeKey IN (1, 2, 3, 4, 5);
I figured that, to save on costs and avoid doing a bunch of individual table retrieve operations, I could just do it using a table batch operation. For some reason I'm getting an exception that says:
"A batch transaction with a retrieve operation cannot contain any other operations"
Here is my code:
public async Task<IList<GalleryPhoto>> GetDomainEntitiesAsync(int someId, IList<Guid> entityIds)
{
    try
    {
        var client = _storageAccount.CreateCloudTableClient();
        var table = client.GetTableReference("SomeTable");
        var batchOperation = new TableBatchOperation();
        var counter = 0;
        var myDomainEntities = new List<MyDomainEntity>();
        foreach (var id in entityIds)
        {
            if (counter < 100)
            {
                batchOperation.Add(TableOperation.Retrieve<MyDomainEntityTableEntity>(someId.ToString(CultureInfo.InvariantCulture), id.ToString()));
                ++counter;
            }
            else
            {
                var batchResults = await table.ExecuteBatchAsync(batchOperation);
                var batchResultEntities = batchResults.Select(o => ((MyDomainEntityTableEntity)o.Result).ToMyDomainEntity()).ToList();
                myDomainEntities.AddRange(batchResultEntities);
                batchOperation.Clear();
                counter = 0;
            }
        }
        return myDomainEntities;
    }
    catch (Exception ex)
    {
        _logger.Error(ex);
        throw;
    }
}
How can I achieve what I'm after without manually looping through the set of row keys and doing an individual Retrieve table operation for each one? I don't want to incur the cost associated with doing this since I could have hundreds of row keys that I want to filter on.
I made a helper method to do it in a single request per partition.
Use it like this:
var items = table.RetrieveMany<MyDomainEntity>(partitionKey, nameof(TableEntity.RowKey),
rowKeysList, columnsToSelect);
Here are the helper methods:
public static List<T> RetrieveMany<T>(this CloudTable table, string partitionKey,
    string propertyName, IEnumerable<string> valuesRange,
    List<string> columnsToSelect = null)
    where T : TableEntity, new()
{
    var entities = table.ExecuteQuery(new TableQuery<T>()
        .Where(TableQuery.CombineFilters(
            TableQuery.GenerateFilterCondition(
                nameof(TableEntity.PartitionKey),
                QueryComparisons.Equal,
                partitionKey),
            TableOperators.And,
            GenerateIsInRangeFilter(
                propertyName,
                valuesRange)
        ))
        .Select(columnsToSelect))
        .ToList();

    return entities;
}

public static string GenerateIsInRangeFilter(string propertyName,
    IEnumerable<string> valuesRange)
{
    string finalFilter = valuesRange.NotNull(nameof(valuesRange))
        .Distinct()
        .Aggregate((string)null, (filterSeed, value) =>
        {
            string equalsFilter = TableQuery.GenerateFilterCondition(
                propertyName,
                QueryComparisons.Equal,
                value);

            return filterSeed == null ?
                equalsFilter :
                TableQuery.CombineFilters(filterSeed,
                    TableOperators.Or,
                    equalsFilter);
        });

    return finalFilter ?? "";
}
I have tested it with fewer than 100 values in rowKeysList; if it throws an exception for larger sets, we can always split the request into parts.
With hundreds of row keys, that rules out using $filter with a list of row keys (which would result in partial partition scan anyway).
With the error you're getting, it seems like the batch contains both queries and other types of operations (which isn't permitted). I don't see why you're getting that error, from your code snippet.
Your only other option is to execute individual queries. You can do these asynchronously though, so you wouldn't have to wait for each to return. Table storage provides upwards of 2,000 transactions / sec on a given partition, so it's a viable solution.
Not sure how I missed this in the first place, but here is a snippet from the MSDN documentation for the TableBatchOperation type:
A batch operation may contain up to 100 individual table operations, with the requirement that each operation entity must have same partition key. A batch with a retrieve operation cannot contain any other operations. Note that the total payload of a batch operation is limited to 4MB.
I ended up executing individual retrieve operations asynchronously as suggested by David Makogon.
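For reference, a sketch of those individual retrieves fired together with Task.WhenAll, reusing the types from the question (MyDomainEntityTableEntity, ToMyDomainEntity, _storageAccount); error handling and chunking of very large id lists are left out:
public async Task<IList<MyDomainEntity>> GetDomainEntitiesAsync(int someId, IList<Guid> entityIds)
{
    var client = _storageAccount.CreateCloudTableClient();
    var table = client.GetTableReference("SomeTable");
    var partitionKey = someId.ToString(CultureInfo.InvariantCulture);

    // One retrieve per row key, awaited together rather than one at a time.
    var tasks = entityIds
        .Select(id => table.ExecuteAsync(
            TableOperation.Retrieve<MyDomainEntityTableEntity>(partitionKey, id.ToString())))
        .ToList();
    var results = await Task.WhenAll(tasks);

    // Missing rows come back with a null Result instead of throwing.
    return results
        .Where(r => r.Result != null)
        .Select(r => ((MyDomainEntityTableEntity)r.Result).ToMyDomainEntity())
        .ToList();
}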
I made my own ghetto link-table. I know it's not that efficient (maybe it's fine), but I only make this request if the data is not cached locally, which only means switching devices. Anyway, this seems to work. Checking the length of the two arrays lets me defer context.done();
var query = new azure.TableQuery()
    .top(1000)
    .where('PartitionKey eq ?', 'link-' + req.query.email.toLowerCase());

tableSvc.queryEntities('linkUserMarker', query, null, function(error, result, response) {
    if (!error && result) {
        var markers = [];
        result.entries.forEach(function(e) {
            tableSvc.retrieveEntity('markerTable', e.markerPartition._, e.RowKey._.toString(), function(error, marker, response) {
                markers.push(marker);
                if (markers.length == result.entries.length) {
                    context.res = {
                        status: 200,
                        body: {
                            status: 'error',
                            markers: markers
                        }
                    };
                    context.done();
                }
            });
        });
    } else {
        notFound(error);
    }
});
I saw your post when I was looking for a solution; in my case I needed to look up multiple ids at the same time.
Because there is no Contains LINQ support (https://learn.microsoft.com/en-us/rest/api/storageservices/query-operators-supported-for-the-table-service), I just made a massive or-equals chain.
Seems to be working for me so far; hope it helps anyone.
public async Task<ResponseModel<ICollection<TAppModel>>> ExecuteAsync(
    ICollection<Guid> ids,
    CancellationToken cancellationToken = default
)
{
    if (!ids.Any())
        throw new ArgumentOutOfRangeException();

    // https://learn.microsoft.com/en-us/rest/api/storageservices/query-operators-supported-for-the-table-service
    // Contains is not supported, so make a massive or-equals statement...lol
    var item = Expression.Parameter(typeof(TTableModel), typeof(TTableModel).FullName);
    var expressions = ids
        .Select(
            id => Expression.Equal(
                Expression.Constant(id.ToString()),
                Expression.MakeMemberAccess(
                    item,
                    typeof(TTableModel).GetProperty(nameof(ITableEntity.RowKey))
                )
            )
        )
        .ToList();
    var builderExpression = expressions.First();
    builderExpression = expressions
        .Skip(1)
        .Aggregate(
            builderExpression,
            Expression.OrElse
        );
    var finalExpression = Expression.Lambda<Func<TTableModel, bool>>(builderExpression, item);

    var result = await _azureTableService.FindAsync(
        finalExpression,
        cancellationToken
    );
    return new(
        result.Data?.Select(_ => _mapper.Map<TAppModel>(_)).ToList(),
        result.Succeeded,
        result.User,
        result.Messages.ToArray()
    );
}
public async Task<ResponseModel<ICollection<TTableEntity>>> FindAsync(
    Expression<Func<TTableEntity, bool>> filter,
    CancellationToken ct = default
)
{
    try
    {
        var queryResultsFilter = _tableClient.QueryAsync<TTableEntity>(
            FilterExpressionTree(filter),
            cancellationToken: ct
        );
        var items = new List<TTableEntity>();
        await foreach (TTableEntity qEntity in queryResultsFilter)
            items.Add(qEntity);

        return new ResponseModel<ICollection<TTableEntity>>(items);
    }
    catch (Exception exception)
    {
        _logger.Error(
            nameof(FindAsync),
            exception,
            exception.Message
        );
        // OBFUSCATE
        // TODO PASS ERROR ID
        throw new Exception();
    }
}

How do i check if my Linq query produced any result?

Hi guys, I am using Entity Framework and I am facing a problem while checking whether my LINQ query returned any results; if it did, I want to use it as a data source. The following is the code, please have a look:
var dbContext = new DBEntities();
try
{
    var linQuery = from cq in dbContext.tblCharacteristics
                   where cq.CharacteristicID.Equals(combobox1.SelectedText)
                   select new
                   {
                       CharacteristicIDs = cq.CharID,
                       CharacteristicNames = cq.CharName
                   };
    if (linQuery.Any()) // Also tried with linQuery.Count() != 0
    {
        lbChaKeyValues.DataSource = linQuery;
        lbChaKeyValues.DisplayMember = "CharacteristicNames";
    }
}
catch (System.Exception ex)
{
    MessageBox.Show(ex.Message);
}
finally
{
    dbContext.Dispose();
}
I am getting following error : "DbComparisonExpression requires arguments with comparable types."
If CharacteristicID is an integer type, the comparison won't work. Instead try:
var inputFromUser = Int32.Parse(combobox1.SelectedText);
var linQuery = from cq in dbContext.tblCharacteristics
               where cq.CharacteristicID == inputFromUser
               select new
               {
                   CharacteristicIDs = cq.CharID,
                   CharacteristicNames = cq.CharName
               };
Incidentally, .Any() is the correct way to test for search results. And if you're not going to use the returned results, there's no need to project the data into an anonymous type; just use select true or select cq, which allows the optimizer to use the best index in the DB.
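In that spirit, a minimal sketch of the existence check alone (assuming CharacteristicID is an int, as above):
var inputFromUser = Int32.Parse(combobox1.SelectedText);

// Translates to an EXISTS query; nothing beyond a bool comes back from the database.
bool hasResults = dbContext.tblCharacteristics
    .Any(cq => cq.CharacteristicID == inputFromUser);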
