I'm sorry if this is a duplicate question, but I did a search and was unable to find info on what I was looking for. If you know of a question to refer to, please link me!
Anyway, I have a function that creates an item:
private Item CreateItem(string name, bool stackable, int amount, string description)
{
    Item item = new Item(name, stackable, amount, description);
    return item;
}
Then I have another function that looks up the stats:
private Item findItemStats(string name)
{
    if (name == "Gold")
        return CreateItem(name, false, 0, "Gold Bar");
    return null;
}
This is what I'm using to add the item to the inventory:
internal void addItem(string name)
{
    var item = findItemStats(name);
    if (item == null)
    {
        Debug.LogError("Item not found!");
        return; // don't add a null item to the inventory
    }
    Instance.itemsToAdd.Add(item);
    if (!inventory())
        return;
    if (inventory().activeInHierarchy)
    {
        placeItemsOnInventory();
        sortItems();
    }
}
My question is: what's a better way to store and retrieve item stat data? At one point I hosted a private server where the item stats were stored in a .txt (or JSON, whatever) file, and a class took that data and applied it to the item being created. I was just curious about a way to do that, or a way to store the data in a separate class/file with easy access and placement of the item data.
This can be a fairly wide-open topic and depends on your needs. The simplest option, if you are just saving something locally, is PlayerPrefs.
PlayerPrefs Example:
PlayerPrefs.SetInt("Player Score", 10);
PlayerPrefs.Save();
//And to fetch:
var playerScore = PlayerPrefs.GetInt("Player Score");
More on using PlayerPrefs
Serialization Example Snippet.
For something more complex, you can serialize your data to a format such as XML, JSON, binary, CSV, or any format you want to import. This is an example of binary serialization.
public void SaveData()
{
    if (!Directory.Exists("Saves"))
        Directory.CreateDirectory("Saves");
    BinaryFormatter formatter = new BinaryFormatter();
    FileStream saveFile = File.Create("Saves/save.binary");
    LocalCopyOfData = PlayerState.Instance.localPlayerData;
    formatter.Serialize(saveFile, LocalCopyOfData);
    saveFile.Close();
}
public void LoadData()
{
    BinaryFormatter formatter = new BinaryFormatter();
    FileStream saveFile = File.Open("Saves/save.binary", FileMode.Open);
    LocalCopyOfData = (PlayerStatistics)formatter.Deserialize(saveFile);
    saveFile.Close();
}
More on Saving and Loading player data
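Since the question mentions JSON: here is a minimal sketch of the same save/load idea using Unity's JsonUtility. The ItemDatabase wrapper type is an assumption on my part (JsonUtility needs a serializable root object), not something from the question.

using System.IO;
using UnityEngine;

// Hypothetical wrapper, since JsonUtility serializes a single root object
[System.Serializable]
public class ItemDatabase
{
    public Item[] items; // assumes Item is marked [System.Serializable]
}

public static class ItemJsonStore
{
    // Write the database as pretty-printed JSON
    public static void Save(ItemDatabase database)
    {
        File.WriteAllText("Saves/items.json", JsonUtility.ToJson(database, true));
    }

    // Read it back into a typed object
    public static ItemDatabase Load()
    {
        return JsonUtility.FromJson<ItemDatabase>(File.ReadAllText("Saves/items.json"));
    }
}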
SQLite
Alternatively, you can use tooling to integrate a SQLite DB into your project. The code for this looks like a standard DB connection in .NET.
string conn = "URI=file:" + Application.dataPath + "/PickAndPlaceDatabase.s3db"; //Path to database.
IDbConnection dbconn;
dbconn = (IDbConnection) new SqliteConnection(conn);
dbconn.Open(); //Open connection to the database.
IDbCommand dbcmd = dbconn.CreateCommand();
string sqlQuery = "SELECT value,name, randomSequence " + "FROM PlaceSequence";
dbcmd.CommandText = sqlQuery;
IDataReader reader = dbcmd.ExecuteReader();
How to set up SQLite with Unity3D.
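The snippet above stops at ExecuteReader. A hedged sketch of consuming the reader and cleaning up might look like this; the column types are assumptions based on the names in the query:

while (reader.Read())
{
    int value = reader.GetInt32(0);          // "value" column, assumed integer
    string name = reader.GetString(1);       // "name" column
    int randomSequence = reader.GetInt32(2); // "randomSequence" column, assumed integer
    // use the row here
}
reader.Close();
dbcmd.Dispose();
dbconn.Close();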
Cloud Hosting
For data that needs to persist and be made available across multiple machines, you may want to consider hosting your data in a proper database or a cloud-hosted data store service. Some examples:
Unity Cloud Data is in alpha (as of 7/10/2016)
Firebase (fun fact: Firebase was originally conceived as a chat server tool for MMOs)
PlayFab
GameSparks
Amazon RDS
Google Cloud SQL (MySQL)
Google Cloud Datastore (NoSQL)
Azure DB
back4app (thanks @Joe Blow)
Other Data Storage options
Googling "backend as a service" yields lots of other goodies as well. Sky's the limit!
Unity has ScriptableObjects that can be used to store data; the objects are stored within the Assets folder, so they are easily accessible. A sketch follows the link below.
https://unity3d.com/learn/tutorials/modules/beginner/live-training-archive/scriptable-objects
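For instance, here is a minimal sketch of an item definition as a ScriptableObject, reusing the fields from the question; the asset menu path is an arbitrary choice:

using UnityEngine;

// Create item assets via Assets > Create > Inventory > Item
[CreateAssetMenu(fileName = "NewItem", menuName = "Inventory/Item")]
public class ItemData : ScriptableObject
{
    public string itemName;
    public bool stackable;
    public int amount;
    [TextArea] public string description;
}

Each asset created from this menu becomes a designer-editable data record, so item stats live in the project rather than in code.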
Related
I have an old application that I wrote in Access VBA; the time has come to upgrade the code, and the company decided to go with C# since we use it the most. My question is the following: I have this code in VBA that works great,
Set RS2 = Db.OpenRecordset("Select * FROM TTable WHERE ID=" & Forms![test]![SifraFirme])
su = RS2.RecordCount
RS2.MoveFirst
Do While Not RS2.EOF
    'lines of code
    RS2.MoveNext
Loop
RS2.Close
Now my question is: is there a C# construct similar to Do While Not RS.EOF? Any literature or examples would be highly appreciated, just a nudge in the right direction, because it has become frustrating. The main point of the code above is to go through the table, filter the data, and write it to XML (a predefined structure) based on ID; once it's done with the first, it moves on to the second, and so on.
Thank you,
Answering this:
The main point of the code above is to go through the table, filter the data, and write it to XML
You can read a database table into a DataSet using an OleDbDataAdapter from the System.Data namespace. Then you can easily work with the filled DataSet, or instantly get its XML representation via its GetXml method:
static void Main(string[] args)
{
    // Note: set the "Prefer 32-bit" option on your C# app to use the Jet.OLEDB provider
    var connectionString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=YourDBPath";
    var query = "Select * FROM TTable";
    // Introducing our DataSet
    var dataSet = new System.Data.DataSet();
    using (var connection = new System.Data.OleDb.OleDbConnection(connectionString))
    {
        var command = new System.Data.OleDb.OleDbCommand(query, connection);
        try
        {
            connection.Open();
            using (var dataAdapter = new System.Data.OleDb.OleDbDataAdapter(command))
            {
                // Fill DataSet
                dataAdapter.Fill(dataSet);
            }
            // Get XML representation of DataSet and save to XML file
            System.IO.File.WriteAllText(@"TTable.xml", dataSet.GetXml());
            // Or if you need to filter data before saving - read through the DataSet
            var TTable = dataSet.Tables["TTable"];
            foreach (var row in TTable.Rows.Cast<System.Data.DataRow>().ToArray()) // using System.Linq needed
            {
            }
        }
        catch (System.Exception ex)
        {
            // Handle exception in some way
            System.Console.WriteLine(ex.Message);
        }
    }
    System.Console.ReadKey();
}
C# has the XmlWriter class, and you can use the SQL classes for querying and reading the information.
The while loop in C# would be something like this:
while (!RS2.EOF)
{
    //lines of code
    RS2.MoveNext();
}
The ! is the logical negation operator.
ADO.NET has the DataSet class which works with data in a way that is similar to a RecordSet in VBA.
See Microsoft's documentation on DataSet
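For instance, a rough C# analogue of the VBA loop, assuming a DataSet filled as shown in the other answer; the ID column name comes from the VBA query:

// Iterate the rows of a filled DataTable, much like Do While Not RS2.EOF ... Loop
foreach (System.Data.DataRow row in dataSet.Tables["TTable"].Rows)
{
    var id = row["ID"]; // read a column value by name
    //lines of code
}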
I'm completely new to C# programming and I'm trying to learn on my own. Currently I'm building a mini-project for practice.
I understand that the UI layer should not contain any data queries, perhaps for security reasons?
So I have created a separate data access class to retrieve data. This is what my data access class looks like (I'll be using stored procedures for better security once I learn how to use them):
public class DataAccess
{
    public List<Customer> FilteredCustomersList(string name)
    {
        using (IDbConnection connection = new MySql.Data.MySqlClient.MySqlConnection(Helper.CnnVal("FineCreteDB")))
        {
            var output = connection.Query<Customer>($"SELECT * from `Customers` WHERE `Cust_Name` LIKE '{name}'").ToList();
            return output;
        }
    }
}
Basically, I send over a string from the user form to query the database, and the retrieved data is stored in a list. The user form:
private void RetrieveData()
{
    try
    {
        DataAccess db = new DataAccess();
        filteredcustomers = db.FilteredCustomersList(CustomerNameTxtBox_AutoComplete.Text);
        ntn_num = filteredcustomers.Select(x => x.Cust_NTN).ElementAt(0);
        strn_num = filteredcustomers.Select(x => x.Cust_STRN).ElementAt(0);
        address = filteredcustomers.Select(x => x.Cust_Address).ElementAt(0);
        phone_num = filteredcustomers.Select(x => x.Cust_Phone).ElementAt(0);
        id_num = filteredcustomers.Select(x => x.Cust_ID).ElementAt(0);
    }
    catch (Exception)
    {
        MessageBox.Show("Customer not found. If customer was recently added, try updating DB.", "Error", MessageBoxButtons.OK, MessageBoxIcon.Exclamation);
        DataAccess db = new DataAccess();
        filteredcustomers = db.AllCustomersList();
        ntn_num = "";
        strn_num = "";
        address = "";
        phone_num = "";
    }
}
On the user form side, "filteredcustomers" holds the list of data sent back. Now here is the problem: I use the filteredcustomers list to extract the individual column values, like so:
address = filteredcustomers.Select(x => x.Cust_Address).ElementAt(0);
and then use them to populate the respective textboxes like:
Address_TxtBox.Text = address;
Everything works fine, but I don't want the user form to have these queries for all the individual columns, because from what I've understood so far, this is bad programming and bad for security as well.
Can anyone guide me how I can keep the values in Data Access layer and just call them into my form?
I'm sorry if this is a long post, I'm just learning and wanted to be as detailed as possible.
You're already doing everything reasonably correctly, as per how Dapper is meant to be used. Dapper doesn't maintain a local graph of entities from the DB, track changes to it, and automatically save them; if you want that, use something like Entity Framework.
With Dapper, you retrieve data with a SELECT and send it back with an UPDATE (see the sketch at the end of this answer).
If you're only expecting one Customer for the name, do this:
var output = connection.QueryFirstOrDefault<Customer>("SELECT * from `Customers` WHERE `Cust_Name` LIKE @n", new { n = name });
https://dapper-tutorial.net/queryfirst
This will return just one customer instance (or null; check it!), meaning you can tidy up your form code to:
c = db.FilteredCustomer(CustomerNameTxtBox_AutoComplete.Text);
ntn_num = c?.Cust_NTN;
strn_num = c?.Cust_STRN;
And so on
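On the write side, here is a minimal sketch of sending a change back with Dapper's Execute; the table and column names come from the question, while newPhone and customerId are hypothetical variables:

using (IDbConnection connection = new MySql.Data.MySqlClient.MySqlConnection(Helper.CnnVal("FineCreteDB")))
{
    // Parameterized UPDATE; Dapper binds the anonymous object's members to @phone and @id
    connection.Execute(
        "UPDATE `Customers` SET `Cust_Phone` = @phone WHERE `Cust_ID` = @id",
        new { phone = newPhone, id = customerId });
}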
Your "if customer was recently added try updating db" doesn't really make sense- the query is done live, so the db is about as up to date as it can be
So I have this code that checks whether new data has been added to the online database by comparing the row counts of the online and local databases. If new data is found, it inserts the new data into the local database.
public class Reservation
{
public string res_no { get; set; }
public string mem_fname { get; set; }
}
My code:
private async void updateDineList()
{
    DBconnector.OpenConnection();
    // Gets data from the online database
    HttpClient client = new HttpClient();
    var response = await client.GetStringAsync("http://example.com/Reservation/view_pending_reservation");
    var persons = JsonConvert.DeserializeObject<List<Reservation>>(response);
    // Gets data from the local database
    string string_reservation = "SELECT res_no,mem_fname FROM res_no WHERE res_status='pending';";
    DataTable reservation_table = new DataTable();
    MySqlDataAdapter adapter_reservation = new MySqlDataAdapter(string_reservation, DBconnector.Connection);
    adapter_reservation.Fill(reservation_table);
    // Gets the row count of each table
    int local = reservation_table.Rows.Count;
    int online = persons.Count;
    // Compares rows of online and local database
    if (local < online)
    {
        // If the online database has more rows than the local database,
        // insert the new data from the online database
        string Command_membership = "INSERT INTO reservation_details (res_no,mem_fname) VALUES (@res_no, @mem_fname);";
        for (int i = local; i < online; i++)
        {
            // Inserts new data from online to local database
            using (MySqlCommand myCmd = new MySqlCommand(Command_membership, DBconnector.Connection))
            {
                myCmd.CommandType = CommandType.Text;
                myCmd.Parameters.AddWithValue("@res_no", persons[i].res_no);
                myCmd.Parameters.AddWithValue("@mem_fname", persons[i].mem_fname);
                myCmd.ExecuteNonQuery();
            }
        }
        MessageBox.Show("New Records Found");
    }
    else
    {
        MessageBox.Show("No new Records");
    }
    DBconnector.Connection.Close();
}
So my question is: are there any problems that could occur with this code? It works fine, but is there any way to improve it? I know MySQL replication would be better, but I am only using free web hosting with few MySQL privileges.
The clear improvement is not to create a new command for every row. You should either create the command and parameters once and then set the parameter values and execute for each row (see the sketch below), or, better still, package the set of updates into a single structure, like an XML string, and then pass the whole lot to the database via a stored procedure call.
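A minimal sketch of the first option, reusing the command, loop, and DBconnector from the question; MySqlDbType.VarChar is an assumption about the column types:

string sql = "INSERT INTO reservation_details (res_no,mem_fname) VALUES (@res_no, @mem_fname);";
using (MySqlCommand myCmd = new MySqlCommand(sql, DBconnector.Connection))
{
    // Create the parameters once...
    myCmd.Parameters.Add("@res_no", MySqlDbType.VarChar);
    myCmd.Parameters.Add("@mem_fname", MySqlDbType.VarChar);
    for (int i = local; i < online; i++)
    {
        // ...then only set their values inside the loop
        myCmd.Parameters["@res_no"].Value = persons[i].res_no;
        myCmd.Parameters["@mem_fname"].Value = persons[i].mem_fname;
        myCmd.ExecuteNonQuery();
    }
}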
The other probably problematic issue is that you are checking purely based on row counts. I don't know if that is valid in your scenario, but it sounds dangerous: what if rows are deleted? Or is that not possible in your scenario? Some other way of checking for the latest updates would probably be preferable; a rough sketch follows.
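For example, assuming you could add an updated_at column to both databases (a hypothetical schema change), you could compare high-water marks instead of counts:

// Sketch: find the newest local timestamp, then fetch only newer online rows
string maxSql = "SELECT MAX(updated_at) FROM reservation_details;";
using (var cmd = new MySqlCommand(maxSql, DBconnector.Connection))
{
    var localMax = cmd.ExecuteScalar() as DateTime?;
    // Ask the web service only for reservations newer than localMax,
    // then insert those rows as in the original code
}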
Without more context that's about all I can see.
Since SQL Server 2016, we can select data directly as JSON with this statement:
SELECT Top (10) * from Products FOR JSON AUTO
So we no longer need to assign the rows to objects and convert them to JSON in code.
I think we can reduce the complexity of the process and get better performance.
I use Web API 2 and I want to receive the JSON and send it directly to the client.
Is there any new function or method that works with SqlCommand to do this? Could you help me please?
This sample shows how to read the JSON from SQL Server using a SqlCommand:
var queryWithForJson = "SELECT ... FOR JSON";
var conn = new SqlConnection("<connection string>");
var cmd = new SqlCommand(queryWithForJson, conn);
conn.Open();
var jsonResult = new StringBuilder();
var reader = cmd.ExecuteReader();
if (!reader.HasRows)
{
    jsonResult.Append("[]");
}
else
{
    while (reader.Read())
    {
        jsonResult.Append(reader.GetValue(0).ToString());
    }
}
In your ApiController, you can return the string using the ResponseMessage method:
public IHttpActionResult Get()
{
    var jsonResult = new StringBuilder();
    // ... get JSON as shown above ...
    var response = new HttpResponseMessage(System.Net.HttpStatusCode.OK);
    response.Content = new StringContent(jsonResult.ToString());
    return ResponseMessage(response);
}
However, though technically feasible, IMHO there are some disadvantages that you should take into account when going this route:
You lose the ability to negotiate the content type that you return from your Web API. If you later have to serve a client that requests XML, you cannot do this easily.
Another, maybe minor, disadvantage is that you reduce the ability to scale the JSON conversion. Usually you have one database server, whereas you can have several web frontends. Obviously you need the database server to get the data, but with in-code conversion you can put the load of the conversion in a place that you can scale better.
I assume that it is more efficient to let SQL Server deliver the data in binary format to the frontends that perform the conversion. I doubt that it will be much faster to put this load on the database server - of course this depends on the infrastructure.
I'm developing a C# WinForms application; it is a client that connects to a web service to get data. The data returned by the web service is a DataTable, which the client displays in a DataGridView.
My problem is that the client takes a long time to get all the data from the server (the web service is not local to the client), so I must use a thread to get the data. This is my model:
The client creates a thread to get data -> the thread completes and sends an event to the client -> the client displays the data in a DataGridView on a form.
However, when the user closes the form and opens it again later, the client must fetch the data all over again. This makes the client slow.
So, I am thinking about caching the data:
Client <---get/add/edit/delete---> Cached Data ---get/add/edit/delete--->Server (web service)
Please give me some suggestions.
For example: should the cached data be managed by another application on the same host as the client, or should it run inside the client itself?
Please give me some techniques to implement this solution.
If you have any examples, please share them.
Thanks.
UPDATE: Hello everyone, maybe my problem was unclear. I only want to cache data for the client's lifetime. I think the cached data should be stored in memory, and when the client wants to get data, it will check the cache first.
If you're using C# 2.0 and you're prepared to ship System.Web as a dependency, then you can use the ASP.NET cache:
using System.Web;
using System.Web.Caching;

Cache webCache;
object cachedObject;
object webServiceResult;
webCache = HttpContext.Current.Cache;
// See if there's a cached item already
cachedObject = webCache.Get("MyCacheItem");
if (cachedObject == null)
{
    // If there's nothing in the cache, call the web service to get a new item
    webServiceResult = new Object();
    // Cache the web service result for five minutes
    webCache.Add("MyCacheItem", webServiceResult, null, DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration, System.Web.Caching.CacheItemPriority.Normal, null);
}
else
{
    // Item already in the cache - cast it to the right type
    webServiceResult = (object)cachedObject;
}
If you're not prepared to ship System.Web, then you might want to take a look at the Enterprise Library Caching block.
If you're on .NET 4.0, however, caching has been pushed into the System.Runtime.Caching namespace. To use this, you'll need to add a reference to System.Runtime.Caching, and then your code will look something like this:
using System.Runtime.Caching;

MemoryCache cache;
object cachedObject;
object webServiceResult;
cache = new MemoryCache("StackOverflow");
cachedObject = cache.Get("MyCacheItem");
if (cachedObject == null)
{
    // Call the web service
    webServiceResult = new Object();
    cache.Add("MyCacheItem", webServiceResult, DateTime.Now.AddMinutes(5));
}
else
{
    webServiceResult = (object)cachedObject;
}
All these caches run in-process to the client. Because your data is coming from a web service, as Adam says, you're going to have difficulty determining the freshness of the data - you'll have to make a judgement call on how often the data changes and how long you cache the data for.
Do you have the ability to make changes or additions to the web service?
If you can, Sync Services may be an option for you. You can define which tables are synchronised, and all the sync plumbing is managed for you.
Check out
http://msdn.microsoft.com/en-us/sync/default.aspx
and shout if you need more information.
You might try the Enterprise Library's Caching Application Block. It's easy to use, stores in memory and, if you ever need to later, it supports adding a backup location for persisting beyond the life of the application (such as to a database, isolated storage, file, etc.) and even encryption too.
Use EntLib 3.1 if you're stuck with .NET 2.0. There's not much new (for caching, at least) in the newer EntLibs aside from better customization support.
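A minimal sketch of the Caching Application Block in use, assuming a default cache manager is configured in app.config and reusing the webServiceResult idea from the examples above:

using Microsoft.Practices.EnterpriseLibrary.Caching;

// Resolve the default cache manager defined in configuration
var cache = CacheFactory.GetCacheManager();

// Store the web service result under a key...
cache.Add("MyCacheItem", webServiceResult);

// ...and read it back later (returns null if the item expired or was never added)
var cachedObject = cache.GetData("MyCacheItem");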
Identify which objects you would like to serialize, and cache to isolated storage. Specify the level of data isolation you would like (application level, user level, etc).
Example:
You could create a generic serializer; a very basic sample would look like this:
public class SampleDataSerializer
{
    public static void Deserialize<T>(out T data, Stream stm)
    {
        var xs = new XmlSerializer(typeof(T));
        data = (T)xs.Deserialize(stm);
    }

    public static void Serialize<T>(T data, Stream stm)
    {
        try
        {
            var xs = new XmlSerializer(typeof(T));
            xs.Serialize(stm, data);
        }
        catch (Exception)
        {
            // Placeholder: handle or log serialization failures here
            throw;
        }
    }
}
Note that you should probably add overloads to the Serialize and Deserialize methods to accommodate readers, or any other types you are actually using in your app (e.g., XmlDocuments, etc.).
The operation to save to IsolatedStorage can be handled by a utility class (example below):
public class SampleIsolatedStorageManager : IDisposable
{
    private string filename;
    private string directoryname;
    IsolatedStorageFile isf;

    public SampleIsolatedStorageManager()
    {
        filename = string.Empty;
        directoryname = string.Empty;
        // create an ISF scoped to domain user...
        isf = IsolatedStorageFile.GetStore(IsolatedStorageScope.User |
            IsolatedStorageScope.Assembly | IsolatedStorageScope.Domain,
            typeof(System.Security.Policy.Url), typeof(System.Security.Policy.Url));
    }

    public void Save<T>(T parm)
    {
        using (IsolatedStorageFileStream stm = GetStreamByStoredType<T>(FileMode.Create))
        {
            SampleDataSerializer.Serialize<T>(parm, stm);
        }
    }

    public T Restore<T>() where T : new()
    {
        try
        {
            if (GetFileNameByType<T>().Length > 0)
            {
                T result = new T();
                using (IsolatedStorageFileStream stm = GetStreamByStoredType<T>(FileMode.Open))
                {
                    SampleDataSerializer.Deserialize<T>(out result, stm);
                }
                return result;
            }
            else
            {
                return default(T);
            }
        }
        catch
        {
            try
            {
                Clear<T>();
            }
            catch
            {
            }
            return default(T);
        }
    }

    public void Clear<T>()
    {
        if (isf.GetFileNames(GetFileNameByType<T>()).Length > 0)
        {
            isf.DeleteFile(GetFileNameByType<T>());
        }
    }

    private string GetFileNameByType<T>()
    {
        return typeof(T).Name + ".cache";
    }

    private IsolatedStorageFileStream GetStreamByStoredType<T>(FileMode mode)
    {
        var stm = new IsolatedStorageFileStream(GetFileNameByType<T>(), mode, isf);
        return stm;
    }

    #region IDisposable Members
    public void Dispose()
    {
        isf.Close();
    }
    #endregion
}
Finally, remember to add the following using clauses:
using System.IO;
using System.IO.IsolatedStorage;
using System.Xml.Serialization;
The actual code to use the classes above could look like this:
var myClass = new MyClass();
myClass.name = "something";
using (var mgr = new SampleIsolatedStorageManager())
{
    mgr.Save<MyClass>(myClass);
}
This will save the instance you specify to isolated storage. To retrieve the instance, simply call:
using (var mgr = new SampleIsolatedStorageManager())
{
    mgr.Restore<MyClass>();
}
Note: the sample I've provided only supports one serialized instance per type. I'm not sure if you need more than that. Make whatever modifications you need to support further functionality.
HTH!
You can serialise the DataTable to a file:
http://forums.asp.net/t/1441971.aspx
Your only concern then is deciding when the cache has gone stale. Perhaps timestamp the file? A rough sketch follows.
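For instance, a minimal sketch using the DataTable's built-in XML persistence, with the file's write time as the staleness check; the path and the five-minute window are arbitrary assumptions:

using System;
using System.Data;
using System.IO;

public static class DataTableFileCache
{
    const string CachePath = "table.cache.xml"; // hypothetical location

    public static void Save(DataTable table)
    {
        // WriteSchema keeps column types so ReadXml can rebuild the table exactly
        table.WriteXml(CachePath, XmlWriteMode.WriteSchema);
    }

    public static DataTable LoadIfFresh()
    {
        if (!File.Exists(CachePath) ||
            DateTime.UtcNow - File.GetLastWriteTimeUtc(CachePath) > TimeSpan.FromMinutes(5))
            return null; // stale or missing: caller should hit the web service again

        var table = new DataTable();
        table.ReadXml(CachePath);
        return table;
    }
}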
In our implementation every row in the database has a last-updated timestamp. Every time our client application accesses a table we select the latest last-updated timestamp from the cache and send that value to the server. The server responds with all the rows that have newer timestamps.
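Sketched in code, the pattern looks roughly like this; the table, column, and connection string are hypothetical:

using System;
using System.Data;
using System.Data.SqlClient;

// Fetch only the rows changed since the cache was last refreshed.
static DataTable FetchChangedRows(DateTime since)
{
    using (var connection = new SqlConnection("<connection string>"))
    using (var cmd = new SqlCommand(
        "SELECT * FROM Customers WHERE LastUpdated > @since", connection))
    {
        cmd.Parameters.AddWithValue("@since", since);
        var table = new DataTable();
        new SqlDataAdapter(cmd).Fill(table); // the adapter opens the connection itself
        return table;
    }
}

The caller merges the returned rows into the local cache and bumps the stored timestamp to the newest LastUpdated value it received.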