My problem is in the title. I have an allCodes array and a codes TextBox (kodTxtBox).
I split the textbox into one element per line and query each element in a for loop.
When I run it, only the last element of the allCodes array shows its query result in the
message box; the others fall into the else branch and show the error message box.
There are some Turkish words in my code, so:
aciklama = description
birim = monad
birimFiyat = Price per 1 unit
ürünler = products
ürünler.sipariskod = products.ordercode etc.
I tried this in a lot of ways, including foreach; all variables are of type string.
allCodes = kodTxtBox.Text.Split('\n');
for (int i = 0; i < allCodes.Length; i++)
{
queryString = "SELECT ürünler.siparisKod, ürünler.aciklama, ürünler.birim, ürünler.fGrup, ürünler.birimfiyat FROM ürünler WHERE (((ürünler.siparisKod)=\"" + allCodes[i] + "\"));";
using (OleDbCommand query = new OleDbCommand(queryString))
{
query.Connection = connection;
reader = query.ExecuteReader();
if (reader.Read())
{
MessageBox.Show(allCodes[i] + " Successful");
var desc = reader["aciklama"].ToString();
var monad = reader["birim"].ToString();
var sellPrice = reader["birimFiyat"].ToString();
MessageBox.Show("Açıklama: " + desc + " Birim: " + monad + " Satış Fiyatı: " + sellPrice);
reader.Close();
}
else
{
MessageBox.Show("Hata");
}
}
}
I solved the problem by making a single query instead of multiple queries. I saved the values returned by that single query into a list, and at the end I ran the necessary for loop over the elements of the list.
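For illustration, a minimal sketch of that single-query approach, assuming the codes contain no quote characters and the usual System.Data.OleDb / System.Linq / System.Collections.Generic usings (building the IN list from the textbox is only a sketch; a parameterized query would be safer):
// Sketch: run one query with an IN list instead of one query per code.
var codes = kodTxtBox.Text.Split(new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries);
string inList = string.Join(", ", codes.Select(c => "\"" + c.Trim() + "\""));
string queryString = "SELECT siparisKod, aciklama, birim, birimFiyat FROM ürünler WHERE siparisKod IN (" + inList + ");";

var results = new List<string[]>();
using (var query = new OleDbCommand(queryString, connection))
using (OleDbDataReader rdr = query.ExecuteReader())
{
    while (rdr.Read())
    {
        // keep description, unit and unit price for the later loop
        results.Add(new[] { rdr["aciklama"].ToString(), rdr["birim"].ToString(), rdr["birimFiyat"].ToString() });
    }
}
// afterwards, loop over results as described above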
I've got a treeview control, which I want to look like this:
Just by messing around with CSS and a text string, I'm actually pretty close. I just need some help getting over the line.
Here is the code I'm using to generate the treeview:
void FillTree_Parent()
{ // fills the parent view of the Tree Action items
//int RoleID = Convert.ToInt32(ddlRole.SelectedValue);
using (SqlConnection con4 = new SqlConnection(ConfigurationManager.ConnectionStrings["PBRConnectionString"].ConnectionString))
{
try
{
SqlCommand cmd2 = new SqlCommand("SELECT [ACCT_GRP], [ACCT_GRP_PK], [ACTIVE_FLG], [LOAD_BY], [LOAD_TIMESTAMP] FROM [ACCT_GRP_LIST] ORDER BY [ACCT_GRP] ASC", con4);
SqlDataAdapter da = new SqlDataAdapter(cmd2);
DataSet PrSet = new DataSet();
da.Fill(PrSet, "ACCT_GRP");
TreeViewAccts.Nodes.Clear();
foreach (DataRow dr in PrSet.Tables[0].Rows)
{
DateTime date = DateTime.Parse(dr["LOAD_TIMESTAMP"].ToString());
string formatted = date.ToString("MM/dd/yyyy");
TreeNode tnParent = new TreeNode();
// Here is our focus
tnParent.Text = dr["ACCT_GRP"].ToString().Replace("'", "''") +
" ········· " + "Active:" + dr["ACTIVE_FLG"].ToString() +
" ········· " + "Loaded On:" + formatted + "";
//
tnParent.Value = dr["ACCT_GRP_PK"].ToString();
tnParent.PopulateOnDemand = true;
tnParent.SelectAction = TreeNodeSelectAction.SelectExpand;
TreeViewAccts.Nodes.Add(tnParent);
FillTree_Child(tnParent, tnParent.Value);
}
}
catch (Exception ae)
{
Response.Write(ae.Message);
}
}
}
In the block marked "// Here is our focus", I need to figure out how to make that first " ········· " generate a dynamic number of spaces, given that dr["ACCT_GRP"] can be up to 75 characters long. So I need to determine the length of dr["ACCT_GRP"], subtract that from 75, and then generate that many spaces.
Can anyone help me with this logic? Also, as a bonus question, if anyone could tell me how to use spaces instead of the "·" characters I'd appreciate it; whenever I just hit the spacebar a bunch of times and enclose it in quotes, it acts like those spaces don't even exist.
int len = dr["ACCT_GRP"].ToString().Length; // the indexer returns object, so call ToString() first
int paddingLength = 75 - len;
string padding = new string('.', paddingLength);
I gather from your question that you are viewing this in a browser (you mentioned CSS). The HTML spec tells the browser to collapse all consecutive whitespace into a single space. You can use the "non-breaking space" character instead; it is written as &nbsp; in HTML, or U+00A0 in Unicode. So your C# code becomes:
int len = dr["ACCT_GRP"].ToString().Length;
int paddingLength = 75 - len;
string padding = new string('\u00A0', paddingLength); // pad with non-breaking spaces
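As a side note, the same padding can be produced in one call with String.PadRight, reusing the 75-character target and the non-breaking space from above (a sketch based on the question's FillTree_Parent code):
// Pad the account group name to 75 characters with non-breaking spaces, then append the rest.
string acctGrp = dr["ACCT_GRP"].ToString();
tnParent.Text = acctGrp.PadRight(75, '\u00A0') + "Active:" + dr["ACTIVE_FLG"].ToString() +
    " ········· " + "Loaded On:" + formatted;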
I want to use a textbox and a checkboxlist to search data in a gridview using ASP.NET C#. Searching with the textbox works, but with the checkboxlist the data is only found when a single checkbox is checked; if more than one checkbox is checked, no data is found. Thanks a lot for helping.
The C# code:
protected void btnSearch_Click(object sender, EventArgs e)
{
if (cblDay.SelectedIndex != -1)
{
foreach (ListItem val in cblDay.Items)
{
if (val.Selected == true)
{
RptRateData.Day += val.Value + "";
}
}
}
RptRateData.RateAmount = txtRate.Text.Trim();
BindGrid();
}
Code for the class:
public string RateAmount { get; set; }
public string Day { get; set; }
internal DataSet GetRptRateSet()
{
DataSet tmpDS = new DataSet();
try
{
string strSQL = #"SELECT ComplexRateInfo.ComplexRateId,
ComplexRateDetailInfo.Day,
ComplexRateInfo.RateAmount
FROM ComplexRateInfo
LEFT JOIN ComplexRateDetailInfo ON ComplexRateInfo.ComplexRateId = ComplexRateDetailInfo.ComplexRateId ";
string whereSQL = " WHERE";
string orderBySQL = " order by Day ;";
int filterCount = 0; //to keep track of needed filter that are going to be used by the sql string
string[] sIndex = new string[2]; //to keep track of the scalar values needed by the sql; two entries because at most two filters are used here
int indexCount = 0; //count to access sIndex members
//filter with or without day
if (_ds.Day != null && _ds.Day != "")
{
if (filterCount > 0) //query need additional filter
whereSQL = whereSQL + " AND ComplexRateDetailInfo.Day LIKE '{" + filterCount + "}'";
else //if this is the first filter
whereSQL = whereSQL + " ComplexRateDetailInfo.Day LIKE '{" + filterCount + "}'";
filterCount++;
sIndex[indexCount] = _ds.Day;
indexCount++;
}
//filter with or without rate amount
if (_ds.RateAmount != null && _ds.RateAmount != "")
{
if (filterCount > 0) //query need additional filter
whereSQL = whereSQL + " AND ComplexRateInfo.RateAmount LIKE '{" + filterCount + "}'";
else //if this is the first filter
whereSQL = whereSQL + " ComplexRateInfo.RateAmount LIKE '{" + filterCount + "}'";
filterCount++;
sIndex[indexCount] = _ds.RateAmount;
indexCount++;
}
//build complete query with no filter at all
if (filterCount == 0)
{
strSQL = strSQL + orderBySQL;
tmpDS = Db.GetDataSet(string.Format(strSQL));
}
//build complete query with 1 or more filter
else
{
strSQL = strSQL + whereSQL + orderBySQL;
tmpDS = Db.GetDataSet(string.Format(strSQL, sIndex));
}
}
catch (Exception ex)
{
throw ex;
}
return tmpDS;
}
There are two mistakes in your code.
1. Assigning values to RptRateData.Day from the CheckBoxList.
Description: You append the selected values to the property without any separator. So, for example, if you selected days 1, 2, and 4, the value of RptRateData.Day becomes "124". Instead, the values should be quoted and comma-separated, as shown below:
var selectedDays = string.Empty;
foreach (ListItem val in cblDay.Items)
{
if (val.Selected == true)
{
selectedDays += "'" + val.Value + "',";
}
}
RptRateData.Day = selectedDays.TrimEnd(new char[] { ',' });
2. The SQL query you build dynamically.
Description: In the WHERE clause of this query you use the LIKE operator for ComplexRateDetailInfo.Day. That will no longer work now that Day holds a list of values; you should use the IN operator instead.
Note: are you sure your LIKE operator works with curly braces ('{' and '}') and without the '%' wildcard?
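For illustration, a minimal sketch of what the Day filter could look like with IN instead of LIKE, assuming RptRateData.Day now holds the quoted, comma-separated list produced above (e.g. '1','2','4'); the _ds field and filterCount variable are the ones from the question:
//filter with or without day, using IN for the comma-separated list of selected days
if (!string.IsNullOrEmpty(_ds.Day))
{
    if (filterCount > 0) //query needs an additional filter
        whereSQL = whereSQL + " AND ComplexRateDetailInfo.Day IN (" + _ds.Day + ")";
    else //if this is the first filter
        whereSQL = whereSQL + " ComplexRateDetailInfo.Day IN (" + _ds.Day + ")";
    filterCount++;
}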
I am in a bit of a pickle regarding a consolidation application we are using in our company. We create a CSV file from a Progress database; this CSV file has 14 columns and NO header.
The CSV file contains payments (around 173 thousand rows). Most of these rows are the same except for the amount column (the last column).
Example:
2014;MONTH;;SC;10110;;;;;;;;EUR;-6500000
2014;01;;SC;10110;;;;;;;;EUR;-1010665
2014;01;;LLC;11110;;;;;;;;EUR;-6567000
2014;01;;SC;10110;;;;;;;;EUR;-1110665
2014;01;;LLC;11110;;;;;;;;EUR;65670.00
2014;01;;SC;10110;;;;;;;;EUR;-11146.65
(around 174000 rows)
As you can see, some of these lines are the same except for the amount column. What I need is to sort all rows, add up the amounts, and save one unique row instead of 1100 rows with different amounts.
My coding skills are failing me to get this done within a certain timeframe; maybe one of you can push me in the right direction to solve this problem.
Example code
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
namespace ConsoleApplication1
{
class Program
{
static void Main(string[] args)
{
string input = File.ReadAllText(@"c:\temp\test.txt");
string inputLine = "";
StringReader reader = new StringReader(input);
List<List<string>> data = new List<List<string>>();
while ((inputLine = reader.ReadLine()) != null)
{
if (inputLine.Trim().Length > 0)
{
string[] inputArray = inputLine.Split(new char[] { ';' });
data.Add(inputArray.ToList());
}
}
//sort data by every column, from the last to the first (OrderBy returns a new sequence, so reassign it)
for (int sortCol = data[0].Count() - 1; sortCol >= 0; sortCol--)
{
data = data.OrderBy(x => x[sortCol]).ToList();
}
//delete duplicate rows
for (int rowCount = data.Count - 1; rowCount >= 1; rowCount--)
{
Boolean match = true;
for (int colCount = 0; colCount < data[rowCount].Count - 2; colCount++)
{
if(data[rowCount][colCount] != data[rowCount - 1][colCount])
{
match = false;
break;
}
}
if (match == true)
{
decimal previousValue = decimal.Parse(data[rowCount - 1][data[rowCount].Count - 1]);
decimal currentValue = decimal.Parse(data[rowCount][data[rowCount].Count - 1]);
string newStrValue = (previousValue + currentValue).ToString();
data[rowCount - 1][data[rowCount].Count - 1] = newStrValue;
data.RemoveAt(rowCount);
}
}
string output = string.Join("\r\n",data.AsEnumerable()
.Select(x => string.Join(";",x.Select(y => y).ToArray())).ToArray());
File.WriteAllText(@"c:\temp\test1.txt",output);
}
}
}
Read the CSV file line by line, and build an in-memory dictionary in which you keep the totals (and other information you require). As most of the lines belong to the same key, it will probably not cause out of memory issues. Afterwards, generate a new CSV based on the information in the dictionary.
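A minimal sketch of that idea, assuming the last column is the amount and everything before it forms the key (file paths are placeholders, and the usual System.IO / System.Linq / System.Globalization / System.Collections.Generic usings apply):
// Sketch: aggregate amounts per key built from the first 13 columns.
var totals = new Dictionary<string, decimal>();
foreach (var line in File.ReadLines(@"c:\temp\test.txt"))
{
    if (line.Trim().Length == 0) continue;
    int cut = line.LastIndexOf(';');                  // key = everything before the last ';'
    string key = line.Substring(0, cut);
    decimal amount = decimal.Parse(line.Substring(cut + 1), CultureInfo.InvariantCulture);
    totals[key] = totals.TryGetValue(key, out var sum) ? sum + amount : amount;
}
File.WriteAllLines(@"c:\temp\test1.txt",
    totals.Select(kv => kv.Key + ";" + kv.Value.ToString(CultureInfo.InvariantCulture)));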
As I interpret your question, what you are asking is how to take input in the form of
#"2014;MONTH;;SC;10110;;;;;;;;EUR;-6500000
2014;01;;SC;10110;;;;;;;;EUR;-1010665
2014;01;;LLC;11110;;;;;;;;EUR;-6567000
2014;01;;SC;10110;;;;;;;;EUR;-1110665
2014;01;;LLC;11110;;;;;;;;EUR;65670.00
2014;01;;SC;10110;;;;;;;;EUR;-11146.65"
and get the last column and then sum it up? If so, this is actually very easy to do with something like this:
public static void Main()
{
string input = #"2014;MONTH;;SC;10110;;;;;;;;EUR;-6500000
2014;01;;SC;10110;;;;;;;;EUR;-1010665
2014;01;;LLC;11110;;;;;;;;EUR;-6567000
2014;01;;SC;10110;;;;;;;;EUR;-1110665
2014;01;;LLC;11110;;;;;;;;EUR;65670.00
2014;01;;SC;10110;;;;;;;;EUR;-11146.65";
var rows = input.Split('\n');
decimal totalValue = 0m;
foreach(var row in rows)
{
var transaction = row.Substring(row.LastIndexOf(';') +1);
decimal val = 0m;
if(decimal.TryParse(transaction, out val))
totalValue += val;
}
Console.WriteLine(totalValue);
}
But maybe I have misunderstood what you were asking for?
Sorry for answering my own post so late, but this is my final solution.
I replace all " characters and write the output to a stream writer (going from a 25 MB to a 15 MB file). Then I copy my CSV file to the SQL server so I can bulk insert it. After the insert I just query the table and read/write the result set to a new file. My new file is only about 700 KB!
The FillData() method fills a DataGridView in my application so you can review the result instead of opening the file in Excel.
I am new to C#; I am currently writing a new solution to query the CSV file directly or in memory and write it back to a new file.
Method1:
string line;
StreamWriter sw = new StreamWriter(insertFile);
using (StreamReader sr = new StreamReader(sourcePath))
{
while ((line = sr.ReadLine()) != null)
{
sw.WriteLine(line.Replace("\"", ""));
}
sr.Close();
sw.Close();
sr.Dispose();
sw.Dispose();
File.Copy(insertFile, @"\\SQLSERVER\C$\insert.csv");
}
Method2:
var destinationFile = @"c:\insert.csv";
var querieImportCSV = "BULK INSERT dbo.TABLE FROM '" + destinationFile + "' WITH ( FIELDTERMINATOR = ';', ROWTERMINATOR = '\n', FIRSTROW = 1)";
var truncate = @"TRUNCATE TABLE dbo.TABLE";
string queryResult =
#"SELECT [Year]
,[Month]
,[Week]
,[Entity]
,[Account]
,[C11]
,[C12]
,[C21]
,[C22]
,[C3]
,[C4]
,[CTP]
,[VALUTA]
,SUM(AMOUNT) as AMOUNT
,[CURRENCY_ORIG]
,[AMOUNTEXCH]
,[AGENTCODE]
FROM dbo.TABLE
GROUP BY YEAR, MONTH, WEEK, Entity, Account, C11, C12, C21, C22, C3, C4, CTP, VALUTA, CURRENCY_ORIG, AMOUNTEXCH, AGENTCODE
ORDER BY Account";
var conn = new SqlConnection(connectionString);
conn.Open();
SqlCommand commandTruncate = new SqlCommand(truncate, conn);
commandTruncate.ExecuteNonQuery();
SqlCommand commandInsert = new SqlCommand(querieImportCSV, conn);
SqlDataReader readerInsert = commandInsert.ExecuteReader();
readerInsert.Close();
FillData();
SqlCommand commandResult = new SqlCommand(queryResult, conn);
SqlDataReader readerResult = commandResult.ExecuteReader();
StringBuilder sb = new StringBuilder();
while (readerResult.Read())
{
sb.Append(readerResult["Year"] + ";" + readerResult["Month"] + ";" + readerResult["Week"] + ";" + readerResult["Entity"] + ";" + readerResult["Account"] + ";" +
readerResult["C11"] + ";" + readerResult["C12"] + ";" + readerResult["C21"] + ";" + readerResult["C22"] + ";" + readerResult["C3"] + ";" + readerResult["C4"] + ";" +
readerResult["CTP"] + ";" + readerResult["Valuta"] + ";" + readerResult["Amount"] + ";" + readerResult["CURRENCY_ORIG"] + ";" + readerResult["AMOUNTEXCH"] + ";" + readerResult["AGENTCODE"]);
}
sb.Replace("\"","");
StreamWriter sw = new StreamWriter(homedrive);
sw.WriteLine(sb);
readerResult.Close();
conn.Close();
sw.Close();
sw.Dispose();
I have written the following program which connects to two LDAP stores, compares the attributes and based on the outcome, creates a new csv file. I have run into a problem though.
Here is the code:
//Define LDAP Connection
string username = "****";
string password = "*****";
string domain = "LDAP://****";
//Define LDAP Connection
string ABSAusername = "****";
string ABSApassword = "****";
string ABSAdomain = "LDAP://****";
//Create Directory Searcher
DirectoryEntry ldapConnection = new DirectoryEntry(domain,username,password);
ldapConnection.AuthenticationType = AuthenticationTypes.Anonymous;
DirectorySearcher ds = new DirectorySearcher(ldapConnection);
ds.Filter = "((EmploymentStatus=0))";
ds.SearchScope = System.DirectoryServices.SearchScope.Subtree;
//Create Directory Searcher
DirectoryEntry ABSAldapConnection = new DirectoryEntry(ABSAdomain, ABSAusername, ABSApassword);
ABSAldapConnection.AuthenticationType = AuthenticationTypes.Anonymous;
DirectorySearcher ABSAds = new DirectorySearcher(ABSAldapConnection);
ABSAds.Filter = "((&(EmploymentStatus=3)(EmploymentStatusDescription=Active))";
ABSAds.SearchScope = System.DirectoryServices.SearchScope.Subtree;
ds.PropertiesToLoad.Add("cn");
ds.PropertiesToLoad.Add ("uid");
ds.PropertiesToLoad.Add("sn");
ds.PropertiesToLoad.Add("PersonnelAreaDesc");
ds.PropertiesToLoad.Add("JobcodeID");
ds.PropertiesToLoad.Add("CostCentreID");
ds.PropertiesToLoad.Add("CostCentreDescription");
ds.PropertiesToLoad.Add ("givenName");
ds.PropertiesToLoad.Add ("EmploymentStatus");
ds.PropertiesToLoad.Add("EmploymentStatusDescription");
ABSAds.PropertiesToLoad.Add("uid");
ABSAds.PropertiesToLoad.Add("EmploymentStatus");
ABSAds.Sort = new SortOption("uid", SortDirection.Ascending);
ds.Sort = new SortOption("cn", SortDirection.Ascending);
SearchResultCollection absaUsers = ds.FindAll();
SearchResultCollection srcUsers = ds.FindAll();
sw.WriteLine("Action" + "," + "uid" + "," + "Business Area" + "," + "employeeNumber" + "," + "First Name" + "," + "Last Name" + "," + "JobCodeID" + "," + "costCentreID" + "," + "costCentreDescription" + "," + "FullName" + "," + "EmploymentStatus" + "," + "EmploymentStatusDescription" );
sw.WriteLine("");
foreach (SearchResult users in srcUsers)
{
string cn = users.Properties["cn"][0].ToString();
string sn = users.Properties["sn"][0].ToString();
string userID = users.Properties["uid"][0].ToString();
string description = users.Properties["PersonnelAreaDesc"][0].ToString();
// string jobCodeID = users.Properties["JobcodeID"][1].ToString();
string CostCentreID = users.Properties["costCentreID"][0].ToString();
string CostCentreDescription = users.Properties["CostCentreDescription"][0].ToString();
string givenName = users.Properties["givenName"][0].ToString();
string employmentStatus = users.Properties["EmploymentStatus"][0].ToString();
string EmploymentStatusDescription = users.Properties["EmploymentStatusDescription"][0].ToString();
foreach (SearchResult absaUser in absaUsers)
{
string absaUID = absaUser.Properties["uid"][0].ToString();
string absaEmploymentStatus = absaUser.Properties["EmploymentStatus"][0].ToString();
if (cn == absaUID)
{
if (absaEmploymentStatus == "3")
{
sw.WriteLine(cn);
}
}
}
}
sw.Flush();
sw.Close();
sw.Dispose();
}
}
I created two foreach loops: in the first loop I assign variables to strings, and in the second foreach loop I do a comparison with an IF statement. What I want to do is: if the uid in one LDAP is equal to the uid in the other LDAP, and the status of the user in the 1st LDAP = 0 but the status of the user in the 2nd LDAP = 3, then I want to print out the users from the 1st LDAP that match that criteria.
If you look through my code, am I doing something wrong? The output of the program currently is about 10 users that are duplicated at least 100 times each.
Thanks in advance.
There are several things obviously wrong with this code.
1) First of all, you're creating the same search result twice:
SearchResultCollection absaUsers = ds.FindAll();
SearchResultCollection srcUsers = ds.FindAll();
So if that searcher finds 10 users, you have two collections of the same 10 users here.
2) You then use nested foreach loops, so you go through all results from the first collection, one by one, and for each of those entries you enumerate the entire second collection, essentially taking the Cartesian product of the two collections. That's why you end up with 10 x 10 = 100 users in the end.
3) You seem to be missing some kind of restriction/condition to pick out only those elements of your second/inner result set that match the outer/first result set in some way. As long as you don't have such a condition in place, you'll always get all the results from the second result set for each element of the first result set, a classic Cartesian product. Somehow, you want to select only certain elements from the second set, based on something from the first result set.
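For illustration, a minimal sketch of how those points could be addressed, assuming the second collection should really come from ABSAds and that uid is unique per ABSA entry (all names are taken from the question; requires System.Collections.Generic):
// Query each directory once and index the ABSA results by uid.
SearchResultCollection srcUsers = ds.FindAll();
SearchResultCollection absaUsers = ABSAds.FindAll();   // not ds.FindAll() a second time

var absaStatusByUid = new Dictionary<string, string>();
foreach (SearchResult absaUser in absaUsers)
{
    string uid = absaUser.Properties["uid"][0].ToString();
    absaStatusByUid[uid] = absaUser.Properties["EmploymentStatus"][0].ToString();
}

foreach (SearchResult user in srcUsers)
{
    string cn = user.Properties["cn"][0].ToString();
    // write only users present in both stores whose ABSA status is 3
    if (absaStatusByUid.TryGetValue(cn, out string status) && status == "3")
        sw.WriteLine(cn);
}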
You are missing a break in the following place:
if (cn == absaUID && absaEmploymentStatus == "3")
{
sw.WriteLine(cn);
break;
}
Is it possible with LINQ to SQL to search the entire database (obviously only the parts that are mapped in the .dbml file) for a string match? I'm trying to write a function that takes a "Search Term" string, searches all mapped entities, and returns a List(Of Object) that can contain a mixture of entities. For example, if I have a table "Foo" and a table "Bar" and search for "wibble", and there is a row in "Foo" and one in "Bar" that contain "wibble", I would like to get back a List(Of Object) that contains a "Foo" object and a "Bar" object.
Is this possible?
Ask your boss the following:
"Boss, when you go to the library to find a book about widgets, do you walk up to the first shelf and start reading every book to see if it is relevant, or do you use some sort of pre-compiled index that the librarian has helpfully configured for you, ahead of time?"
If he says "Well, I would use the index" then you need a Full Text index.
If he says "Well, I would start reading every book, one by one" then you need a new job, a new boss, or both :-)
LINQ to SQL, ORMs in general, and even plain SQL are a bad match for such a query. You are describing a full-text search, so you should use SQL Server's full-text search functionality. Full-Text Search is available in all versions and editions since SQL Server 2000, including SQL Server Express. You need to create an FTS catalog and write queries that use the CONTAINS and FREETEXT functions.
Why do you need such functionality? Unless you specifically want to FTS-enable your application, this is a ... strange ... way to access your data.
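For illustration, a minimal sketch of querying a full-text indexed column with CONTAINS from ADO.NET; the Products table, ProductName column and connectionString here are placeholders rather than anything from the question, and a full-text catalog/index must already exist on that column:
// Sketch: run a CONTAINS query against a full-text indexed column.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "SELECT ProductID, ProductName FROM Products WHERE CONTAINS(ProductName, @term)", conn))
{
    cmd.Parameters.AddWithValue("@term", "\"wibble\"");   // FTS search term
    conn.Open();
    using (SqlDataReader r = cmd.ExecuteReader())
    {
        while (r.Read())
            Console.WriteLine(r["ProductID"] + ": " + r["ProductName"]);
    }
}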
It's probably 'possible', but most databases are accessed over the web or a network, so it would be a very expensive operation. It sounds like bad design.
Also there is the problem of table and column names; this is probably your biggest problem. It's possible to get the column names through reflection, but I don't know about table names:
foreach (PropertyInfo property in typeof(TEntity).GetProperties())
yield return property.Name;
edit: @Ben, you're right, my mistake.
This can be done but will not be pretty. There are several possible solutions.
1. Write the queries for every table yourself and execute them all in your query method.
var users = context.Users
.Where(x => x.FirstName.Contains(txt) || x.LastName.Contains(txt))
.ToList();
var products = context.Products
.Where(x => x.ProductName.Contains(txt));
var result = users.Cast<Object>().Concat(products.Cast<Object>());
2. Fetch all (relevant) tables into memory and perform the search using reflection. Less code to write, paid for with a huge performance impact.
3. Build the expression trees for the searches using reflection. This is probably the best solution, but also the most challenging to realize (a sketch follows this list).
4. Use something designed for full-text search - for example full-text search integrated into SQL Server or Apache Lucene.
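For option 3, a minimal sketch of building such a predicate with expression trees, assuming you only want to match string properties (requires System.Linq, System.Linq.Expressions and System.Reflection; the helper name is made up):
// Sketch: builds e => e.Prop1.Contains(term) || e.Prop2.Contains(term) || ... over all string properties of T.
static Expression<Func<T, bool>> BuildContainsPredicate<T>(string term)
{
    var parameter = Expression.Parameter(typeof(T), "e");
    var constant = Expression.Constant(term);
    var contains = typeof(string).GetMethod("Contains", new[] { typeof(string) });

    Expression body = null;
    foreach (var prop in typeof(T).GetProperties().Where(p => p.PropertyType == typeof(string)))
    {
        var call = Expression.Call(Expression.Property(parameter, prop), contains, constant);
        body = body == null ? (Expression)call : Expression.OrElse(body, call);
    }
    // no string properties -> match nothing
    return Expression.Lambda<Func<T, bool>>(body ?? (Expression)Expression.Constant(false), parameter);
}
// usage sketch: var hits = context.Users.Where(BuildContainsPredicate<User>(txt)).Cast<Object>().ToList();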
All LINQ solutions will (probably) require one query per table, which imposes a non-negligible performance impact if you have many tables. Here one should look for a way to batch these queries into a single one. One of our projects using LINQ to SQL used a library for batching queries, but I don't know what its name was or what exactly it could do, because I worked most of the time in the front-end team.
Possible, but from my point of view it is not recommended. Consider having 1000K records across 100 tables: the performance will be slow. You can do it with LINQ to SQL by creating a stored procedure at the database level and calling it through the entities. It will be much faster than what you are trying to achieve =)
Late answer, but since I just had to come up with something for myself, here goes. I wrote the following to search all columns of all tables for a string match. This relates to a data-forensics task I was given: find all occurrences of a string match in a database weighing around 24 GB. At that size, you can imagine using cursors or single-threaded queries would be rather slow, and searching the entire database would take ages. I wrote the following CLR stored procedure to do the work for me server-side and return results in XML, while forcing parallelization. It is impressively fast: a database-wide search on the standard AdventureWorks2017 database completes in less than 2 seconds. Enjoy!
Example usages:
Using all available processors on the server:
EXEC [dbo].[SearchAllTables] @valueSearchTerm = 'john michael'
Limiting the server to 4 concurrent threads:
EXEC [dbo].[SearchAllTables] @valueSearchTerm = 'john michael', @maxDegreeOfParallelism = 4
Using logical operators in search terms:
EXEC [dbo].[SearchAllTables] @valueSearchTerm = '(john or michael) and not jack', @tablesSearchTerm = 'not contact'
Limiting search to table names and/or column names containing some search terms:
EXEC [dbo].[SearchAllTables] @valueSearchTerm = 'john michael', @tablesSearchTerm = 'person contact', @columnsSearchTerm = 'address name'
Limiting search results to the first row of each table where the terms are found:
EXEC [dbo].[SearchAllTables] @valueSearchTerm = 'john michael', @getOnlyFirstRowPerTable = 1
Limiting the search to the schema only automatically returns only the first row for each table:
EXEC [dbo].[SearchAllTables] @tablesSearchTerm = 'person contact'
Only return the search queries:
EXEC [dbo].[SearchAllTables] @valueSearchTerm = 'john michael', @tablesSearchTerm = 'person contact', @onlyOutputQueries = 1
Capturing results into temporary table and sorting:
CREATE TABLE #temp (Result NVARCHAR(MAX));
INSERT INTO #temp
EXEC [dbo].[SearchAllTables] @valueSearchTerm = 'john';
SELECT * FROM #temp ORDER BY Result ASC;
DROP TABLE #temp;
https://pastebin.com/RRTrt8ZN
I ended up writing this little custom gem (it finds all matching records given a search term):
namespace SqlServerMetaSearchScan
{
using Newtonsoft.Json;
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Linq;
using System.Security.Cryptography;
using System.Text;
using System.Threading;
using System.Xml;
public class Program
{
#region Ignition
public static void Main(string[] args)
{
// Defaulting
SqlConnection connection = null;
try
{
// Questions
ColorConsole.Print("SQL Connection String> ");
string connectionString = Console.ReadLine();
ColorConsole.Print("Search Term (Case Ignored)> ");
string searchTerm = Console.ReadLine();
ColorConsole.Print("Skip Databases (Comma Delimited)> ");
List<string> skipDatabases = Console.ReadLine().Split(',').Where(item => item.Trim() != string.Empty).ToList();
// Search
connection = new SqlConnection(connectionString);
connection.Open();
// Each database
List<string> databases = new List<string>();
string databasesLookup = "SELECT name FROM master.dbo.sysdatabases";
SqlDataReader reader = new SqlCommand(databasesLookup, connection).ExecuteReader();
while (reader.Read())
{
// Capture
databases.Add(reader.GetValue(0).ToString());
}
// Build quintessential folder
string logsDirectory = @"E:\Logs";
if (!Directory.Exists(logsDirectory))
{
// Build
Directory.CreateDirectory(logsDirectory);
}
string baseFolder = @"E:\Logs\SqlMetaProbeResults";
if (!Directory.Exists(baseFolder))
{
// Build
Directory.CreateDirectory(baseFolder);
}
// Close reader
reader.Close();
// Sort databases
databases.Sort();
// New space
Console.WriteLine(Environment.NewLine + " Found " + databases.Count + " Database(s) to Scan" + Environment.NewLine);
// Deep scan
foreach (string databaseName in databases)
{
// Skip skip databases
if (skipDatabases.Contains(databaseName))
{
// Skip
continue;
}
// Select the database
new SqlCommand("USE " + databaseName, connection).ExecuteNonQuery();
// Table count
int tablePosition = 1;
try
{
// Defaulting
List<string> tableNames = new List<string>();
// Schema examination
DataTable table = connection.GetSchema("Tables");
// Query tables
string tablesLookup = "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES";
using (SqlDataReader databaseReader = new SqlCommand(tablesLookup, connection).ExecuteReader())
{
// Get data
while (databaseReader.Read())
{
// Push
if (databaseReader.GetValue(0).ToString().Trim() != string.Empty)
{
tableNames.Add(databaseReader.GetValue(0).ToString());
}
}
// Bail
databaseReader.Close();
}
// Sort
tableNames.Sort();
// Cycle tables
foreach (string tableName in tableNames)
{
// Build data housing
string databasePathName = @"E:\Logs\SqlMetaProbeResults\" + databaseName;
string tableDirectoryPath = @"E:\Logs\SqlMetaProbeResults\" + databaseName + @"\" + tableName;
// Count first
int totalEntityCount = 0;
int currentEntityPosition = 0;
string countQuery = "SELECT count(*) FROM " + databaseName + ".dbo." + tableName;
using (SqlDataReader entityCountReader = new SqlCommand(countQuery, connection).ExecuteReader())
{
// Query count
while (entityCountReader.Read())
{
// Capture
totalEntityCount = int.Parse(entityCountReader.GetValue(0).ToString());
}
// Close
entityCountReader.Close();
}
// Write the objects into the housing
string jsonLookupQuery = "SELECT * FROM " + databaseName + ".dbo." + tableName;
using (SqlDataReader tableReader = new SqlCommand(jsonLookupQuery, connection).ExecuteReader())
{
// Defaulting
List<string> fieldValueListing = new List<string>();
// Read continue
while (tableReader.Read())
{
// Increment
currentEntityPosition++;
// Defaulting
string identity = null;
// Gather data
for (int i = 0; i < tableReader.FieldCount; i++)
{
// Set
if (tableReader.GetName(i).ToUpper() == "ID")
{
identity = tableReader.GetValue(0).ToString();
}
else
{
// Build column data entry
string thisColumn = tableReader.GetValue(i) != null ? "'" + tableReader.GetValue(i).ToString().Trim() + "'" : string.Empty;
// Piece
fieldValueListing.Add(thisColumn);
}
}
// Path-centric
string explicitIdentity = identity ?? Guid.NewGuid().ToString().Replace("-", string.Empty).ToLower();
string filePath = tableDirectoryPath + @"\" + "Obj." + explicitIdentity + ".json";
string reStringed = JsonConvert.SerializeObject(fieldValueListing, Newtonsoft.Json.Formatting.Indented);
string percentageMark = ((double)tablePosition / (double)tableNames.Count * 100).ToString("#00.0") + "%";
string thisMarker = Guid.NewGuid().ToString().Replace("-", string.Empty).ToLower();
string entityPercentMark = string.Empty;
if (totalEntityCount != 0 && currentEntityPosition != 0)
{
// Percent mark
entityPercentMark = ((double)currentEntityPosition / (double)totalEntityCount * 100).ToString("#00.0") + "%";
}
// Search term verify
if (searchTerm.Trim() != string.Empty)
{
// Search term scenario
if (reStringed.ToLower().Trim().Contains(searchTerm.ToLower().Trim()))
{
// Lazy build
if (!Directory.Exists(tableDirectoryPath))
{
// Build
Directory.CreateDirectory(tableDirectoryPath);
}
// Has the term
string idMolding = identity == null || identity == string.Empty ? "No Identity" : identity;
File.WriteAllText(filePath, reStringed);
ColorConsole.Print(percentageMark + " => " + databaseName + "." + tableName + "." + idMolding + "." + thisMarker + " (" + entityPercentMark + ")", ConsoleColor.Green, ConsoleColor.Black, true);
}
else
{
// Show progress
string idMolding = identity == null || identity == string.Empty ? "No Identity" : identity;
ColorConsole.Print(percentageMark + " => " + databaseName + "." + tableName + "." + idMolding + "." + thisMarker + " (" + entityPercentMark + ")", ConsoleColor.Yellow, ConsoleColor.Black, true);
}
}
}
// Close
tableReader.Close();
}
// Increment
tablePosition++;
}
}
catch (Exception err)
{
ColorConsole.Print("DB.Tables!: " + err.Message, ConsoleColor.Red, ConsoleColor.White, false);
}
}
}
catch (Exception err)
{
ColorConsole.Print("KABOOM!: " + err.ToString(), ConsoleColor.Red, ConsoleColor.White, false);
}
finally
{
try { connection.Close(); }
catch { }
}
// Await
ColorConsole.Print("Done.");
Console.ReadLine();
}
#endregion
#region Cores
public static string GenerateHash(string inputString)
{
// Defaulting
string calculatedChecksum = null;
// Calculate
SHA256Managed checksumBuilder = new SHA256Managed();
string hashString = string.Empty;
byte[] hashBytes = checksumBuilder.ComputeHash(Encoding.ASCII.GetBytes(inputString));
foreach (byte theByte in hashBytes)
{
hashString += theByte.ToString("x2");
}
calculatedChecksum = hashString;
// Return
return calculatedChecksum;
}
#endregion
#region Colors
public class ColorConsole
{
#region Defaulting
public static ConsoleColor DefaultBackground = ConsoleColor.DarkBlue;
public static ConsoleColor DefaultForeground = ConsoleColor.Yellow;
public static string DefaultBackPorch = " ";
#endregion
#region Printer Cores
public static void Print(string phrase)
{
// Use primary
Print(phrase, DefaultForeground, DefaultBackground, false);
}
public static void Print(string phrase, ConsoleColor customForecolor)
{
// Use primary
Print(phrase, customForecolor, DefaultBackground, false);
}
public static void Print(string phrase, ConsoleColor customBackcolor, bool inPlace)
{
// Use primary
Print(phrase, DefaultForeground, customBackcolor, inPlace);
}
public static void Print(string phrase, ConsoleColor customForecolor, ConsoleColor customBackcolor)
{
// Use primary
Print(phrase, customForecolor, customBackcolor, false);
}
public static void Print(string phrase, ConsoleColor customForecolor, ConsoleColor customBackcolor, bool inPlace)
{
// Capture settings
ConsoleColor captureForeground = Console.ForegroundColor;
ConsoleColor captureBackground = Console.BackgroundColor;
// Change colors
Console.ForegroundColor = customForecolor;
Console.BackgroundColor = customBackcolor;
// Write
if (inPlace)
{
// From beginning of this line + padding
Console.Write("\r" + phrase + DefaultBackPorch);
}
else
{
// Normal write
Console.Write(phrase);
}
// Revert
Console.ForegroundColor = captureForeground;
Console.BackgroundColor = captureBackground;
}
#endregion
}
#endregion
}
}