Xamarin cross-platform: read an XML file from a URL - C#

I have an XML file on a server: www.testsite.com/sample_file.xml
The structure of the XML is like this:
<mobileclients>
<clientitem>
<code>SXPFBD</code>
<api>http://SPFD.azurewebsites.net/APIs/</api>
</clientitem>
<clientitem>
<code>STYPFBD</code>
<api>http://SPFD.azurewebsites.net/APIs/</api>
</clientitem>
</mobileclients>
I'd like to check whether an input code is present in the XML, from a Xamarin cross-platform application.
I can't keep the file as a resource, since it will keep updating. So is it possible to read from an online XML file on the fly?

Sure. Although I would recommend switching to JSON if you can, it is still possible to work with XML.
Here is a code snippet from an app I developed:
public List<NewsItem> GetNewsItems()
{
    var newsXml = XDocument.Load("http://xxxxxxxx.nl/index.php?page=news");
    var newsItems = (from newsitem in newsXml.Descendants("newsitem")
                     select new NewsItem
                     {
                         Title = WebUtility.HtmlDecode(newsitem.Element("title").Value),
                         Message = newsitem.Element("message").Value,
                         Date = DateTime.ParseExact(newsitem.Element("date").Value, "dd-MM-yyyy HH:mm", CultureInfo.InvariantCulture), // TODO better error handling
                         User = newsitem.Element("user").Value,
                         Replies = Convert.ToInt32(newsitem.Element("replies").Value),
                         Url = newsitem.Element("budgetgamingurl").Value,
                         Views = Convert.ToInt32(newsitem.Element("views").Value)
                     }).ToList();
    return newsItems;
}
As you can see, it loads the document from a URL and runs the LINQ to XML queries in memory.
It is a bit rough and could use some error handling, but you'll get the basic idea.
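To answer the original question directly, here is a minimal sketch of the membership check, assuming the <mobileclients> structure shown above. `CodeExists` is a hypothetical helper name; on a device you would first download the document as a string (for example with HttpClient.GetStringAsync against the asker's URL) and then parse it:

```csharp
using System.Linq;
using System.Xml.Linq;

// Hypothetical helper: true if the given client code appears in the feed.
// XDocument.Parse works on a string you have already downloaded,
// e.g. via: var xml = await new HttpClient().GetStringAsync("http://www.testsite.com/sample_file.xml");
static bool CodeExists(string xml, string code)
{
    var doc = XDocument.Parse(xml);
    return doc.Descendants("clientitem")
              .Any(item => (string)item.Element("code") == code);
}
```

Because the file is fetched at runtime, updates on the server are picked up on the next request, which is exactly why keeping it as a bundled resource is unnecessary.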

Related

Get value from Twitch API

I'm currently trying to make a system that changes a button's color based on whether the streamer on the button is live or not. I have a way to download the JSON string into a variable, but I don't know what to do with it. I know I have to check whether the "stream" variable in the JSON output is null, which means the streamer is offline, but I have no clue how to do that.
Edit: I've added the code that I currently have. The JSON is being properly parsed (r.stream gives me the appropriate data), but I can't figure out how to determine whether the stream is live or not. This is supposed to run on button press, which refreshes the data.
private void Refresh_Click(object sender, RoutedEventArgs e)
{
    string url = @"https://api.twitch.tv/kraken/streams/camoduck?client_id=xskte44y2wfqin464ayecyc09nikcj";
    var json = new WebClient().DownloadString(url);
    Rootobject r = JsonConvert.DeserializeObject<Rootobject>(json);
    Console.WriteLine(r.stream);
    if (r.stream.game == "Grand Theft Auto V")
    {
        _1GUnit1.Background = Brushes.Red;
    }
}
.......
var json = new WebClient().DownloadString(url);
var r = JsonConvert.DeserializeObject<Rootobject>(json);
Console.WriteLine(r.stream);
if (r.stream == null) // How a null check can be done
{
    _1GUnit1.Background = Brushes.Red;
}
BTW: if you are using "http://json2csharp.com/", it is probably RootObject, not Rootobject.
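For completeness, the null check can be reduced to a tiny self-contained helper. This sketch uses System.Text.Json's JsonDocument instead of Json.NET purely so it has no external dependency; `IsLive` is a hypothetical name, and the `stream` property matches the Kraken payload discussed above:

```csharp
using System.Text.Json;

// A Kraken /streams response is "live" when the "stream" property
// exists and is not null; otherwise the streamer is offline.
static bool IsLive(string json)
{
    using var doc = JsonDocument.Parse(json);
    return doc.RootElement.TryGetProperty("stream", out var stream)
           && stream.ValueKind != JsonValueKind.Null;
}
```

With Json.NET the equivalent is the `r.stream == null` comparison shown above; the point is the same either way: test the object for null before touching `r.stream.game`.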
Without you providing more details about what doesn't work in your case, I can't give an in-depth explanation or guidance.
What I can suggest is that you use an API wrapper, e.g. TwitchLib.
That should help you get started and should provide enough documentation for your case.
You should do more reading on the Twitch API and search for examples. When the stream is offline, consider dropping in some sort of template, as there is no data from the request to parse. When a stream is online, you will have access to the stream object's properties. For example, if your success function returns data, you can assign the results as:
game = data.stream.game;
logo = data.stream.channel.logo;
name = data.stream.channel.name;
url = data.stream.channel.url;
stream = data.stream.stream_type;
This assumes you've set up the appropriate variables (you didn't provide any code).
I would also recommend you spend some time learning how to debug in the browser. More specifically in this case, learn how to inspect your result data. This will demystify what's in the object as you'll see the data and its properties, etc.
Have a look at the following Stack post:
Inspecting large JSON data in Chrome

Saving, loading and manipulating (listview) data C#

I'm working on a UWP application where a user can input data which is placed in a ListView. All fine and dandy, but how can I save the user data to a separate file and load it the next time the user boots up the app?
I've tried to find a solution, but I had great difficulty understanding the code snippets I found and how to apply them (since I'm fairly new to C# and app development). Would somebody explain how I can achieve the saving/loading of the data and what the code does?
Thanks in advance! :)
You can get the app's local folder and create a file like this:
StorageFolder local = Windows.Storage.ApplicationData.Current.LocalFolder;
StorageFile ageFile = await local.CreateFileAsync("Age.txt", CreationCollisionOption.FailIfExists);
And read from the file like this:
var ageStream = await local.OpenStreamForReadAsync("Age.txt");
// Read the data.
using (StreamReader streamReader = new StreamReader(ageStream))
{
    // Use like a normal StreamReader.
}
If you are trying to write, use OpenStreamForWriteAsync.
If I understood well, you have some kind of object structure that serves as a model for your ListView. When the application is started, you want to read a file where the data is present, and when closing the application (or on some other event) write the file with the changes. Right?
1) When your application is loaded / closed (or upon modification, or on some event of your choice), use the Windows.Storage API to read / write the text of the file.
2) If the data you want to write is just a list of strings, you can save it as-is in the file. If it is more complicated, I would recommend serializing it to JSON. Use JSON.NET to serialize (object -> string) and deserialize (string -> object) the content of your file and object structure.
Product product = new Product();
product.Name = "Apple";
...
string json = JsonConvert.SerializeObject(product);
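The snippet above covers the serialize direction; deserializing goes the other way. Sketched here as a round trip with System.Text.Json so the snippet is dependency-free (Json.NET's JsonConvert.SerializeObject / DeserializeObject work the same way), and in a UWP app you would hand the string to FileIO.WriteTextAsync / ReadTextAsync:

```csharp
using System.Collections.Generic;
using System.Text.Json;

var items = new List<string> { "Apple", "Banana" };

// Serialize (object -> string), then write the string to your file.
string json = JsonSerializer.Serialize(items);

// Later: read the string back from the file and deserialize (string -> object).
var restored = JsonSerializer.Deserialize<List<string>>(json);
```

The restored list is what you would bind back to the ListView on startup.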

ASP.NET Web Form Inputs to CSV File

I have created a web form in ASP.NET; unfortunately I do not have the pleasure of being able to use SQL due to some internal restrictions. I had the idea of exporting the data to a CSV file every time the user clicks submit.
The purpose of this is so that we have a list of computer names, installed software, serial #, etc. to keep track of.
I have never done anything like this, as I am used to SQL (beginner level). Is this even possible, and how would I do it (in code form)? I've googled it and all I'm getting is GridView and other controls that I don't want to use, and even those seem more complicated than what I want to do. Any help would be appreciated.
I'm using a C# back end to code this.
To clarify, this will be a single CSV file on our network, and this web form will only be used internally. So every time someone clicks the submit button it will add a new "row" to the same CSV file. In the end we have one CSV file with a list of computer names and other information.
Typically you have a model class that represents the data. Here's one:
public class Computer
{
    public string SerialNumber { get; set; }
    public string Name { get; set; }
    public List<string> InstalledSoftware { get; set; }
}
Now that you have a class that can represent the data, it's just a matter of saving or serializing it. You don't have access to a SQL Server database. That's fine; there are other options. You can store it in a structured file format. CSV is not good for this, as you might have multiple pieces of InstalledSoftware per computer, and it's hard to handle that properly with CSV. But other text-based formats such as XML and JSON are perfect for this. You can also use "NoSQL" databases such as MongoDB or RavenDB. You may also be able to use SQLite, which is very lightweight.
Let's start off with some sample data.
List<Computer> computers = new List<Computer>();
computers.Add(new Computer(){
    SerialNumber = "ABC123",
    Name = "BOB-LAPTOP",
    InstalledSoftware = new List<string>(){
        "Notepad", "Visual Studio", "Word"
    }
});
computers.Add(new Computer(){
    SerialNumber = "XYZ456",
    Name = "JASON-WORKSTATION",
    InstalledSoftware = new List<string>(){
        "Notepad++", "Visual Studio Code", "Word"
    }
});
computers.Add(new Computer(){
    SerialNumber = "LMN789",
    Name = "NANCY-SURFACE3",
    InstalledSoftware = new List<string>(){
        "Outlook", "PowerPoint", "Excel"
    }
});
Then it's just a matter of saving the data. Let's try with XML:
var xmlSerializer = new XmlSerializer(typeof(List<Computer>));
using (var stringWriter = new StringWriter())
{
    using (var xmlWriter = XmlWriter.Create(stringWriter))
    {
        xmlSerializer.Serialize(xmlWriter, computers);
    }
    var xml = stringWriter.ToString();
    File.WriteAllText(Server.MapPath("~/App_Data/computers.xml"), xml);
}
Or with JSON:
var serializer = new JavaScriptSerializer();
var json = serializer.Serialize(computers);
File.WriteAllText(Server.MapPath("~/App_Data/computers.json"));
Using MongoDB:
var client = new MongoClient(connectionString);
var server = client.GetServer();
var database = server.GetDatabase("ComputerDB");
var computerCollection = database.GetCollection<Computer>("Computers");
foreach (var computer in computers)
{
    computerCollection.Insert(computer);
}
Note: I have not tested this code, so there are likely bugs. My goal is to give you a starting point, not necessarily 100% working code. I haven't serialized XML in a while, and I typically use JSON.NET instead of the built-in JavaScriptSerializer.
Also note that if there's any possibility that two users might access the data at the same time, you'll need to take care with the XML and JSON approaches to avoid two people trying to write to the file at once. That's where MongoDB would be better, but you'll have to install the server software somewhere.
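That said, if you do want the single shared CSV file the question describes, appending one row per submit is workable as long as every field is quoted and embedded quotes are doubled. A naive sketch (the helper name, file path, and field order are my assumptions; a library such as CsvHelper handles the edge cases better):

```csharp
using System;
using System.IO;
using System.Linq;

// Naive CSV quoting: wrap every field in quotes and double embedded quotes.
// This does NOT handle newlines inside fields or concurrent writers.
static string ToCsvRow(params string[] fields) =>
    string.Join(",", fields.Select(f => "\"" + f.Replace("\"", "\"\"") + "\""));

// On submit, append one row to the shared file, e.g. (path is hypothetical):
// File.AppendAllText(path, ToCsvRow(name, serialNumber, software) + Environment.NewLine);
```

This keeps the "one row per submit" model, but the multi-valued InstalledSoftware problem and the concurrent-write problem from the answer above still apply.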

Pulling data from a webpage, parsing it for specific pieces, and displaying it

I've been using this site for a long time to find answers to my questions, but I wasn't able to find the answer on this one.
I am working with a small group on a class project. We're to build a small "game trading" website that allows people to register, put in a game they have they want to trade, and accept trades from others or request a trade.
We have the site functioning long ahead of schedule so we're trying to add more to the site. One thing I want to do myself is to link the games that are put in to Metacritic.
Here's what I need to do: I need to (using ASP.NET and C# in Visual Studio 2012) get the correct game page on Metacritic, pull its data, parse it for specific parts, and then display the data on our page.
Essentially when you choose a game you want to trade for we want a small div to display with the game's information and rating. I'm wanting to do it this way to learn more and get something out of this project I didn't have to start with.
I was wondering if anyone could tell me where to start. I don't know how to pull data from a page. I'm still trying to figure out if I need to try and write something to automatically search for the game's title and find the page that way or if I can find some way to go straight to the game's page. And once I've gotten the data, I don't know how to pull the specific information I need from it.
One of the things that doesn't make this easy is that I'm learning C++ along with C# and ASP.NET, so I keep getting my wires crossed. If someone could point me in the right direction, it would be a big help. Thanks!
This small example uses HtmlAgilityPack, using XPath selectors to get to the desired elements.
protected void Page_Load(object sender, EventArgs e)
{
    string url = "http://www.metacritic.com/game/pc/halo-spartan-assault";
    var web = new HtmlAgilityPack.HtmlWeb();
    HtmlDocument doc = web.Load(url);

    string metascore = doc.DocumentNode.SelectNodes("//*[@id=\"main\"]/div[3]/div/div[2]/div[1]/div[1]/div/div/div[2]/a/span[1]")[0].InnerText;
    string userscore = doc.DocumentNode.SelectNodes("//*[@id=\"main\"]/div[3]/div/div[2]/div[1]/div[2]/div[1]/div/div[2]/a/span[1]")[0].InnerText;
    string summary = doc.DocumentNode.SelectNodes("//*[@id=\"main\"]/div[3]/div/div[2]/div[2]/div[1]/ul/li/span[2]/span/span[1]")[0].InnerText;
}
An easy way to obtain the XPath for a given element is by using your web browser (I use Chrome) Developer Tools:
Open the Developer Tools (F12 or Ctrl + Shift + C on Windows or Command + Shift + C for Mac).
Select the element in the page that you want the XPath for.
Right click the element in the "Elements" tab.
Click on "Copy as XPath".
You can paste it exactly like that in C# (as shown in my code), but make sure to escape the quotes.
Make sure you use some error handling techniques, because web scraping can cause errors whenever the site changes the HTML structure of the page.
Edit
Per @knocte's suggestion, here is the link to the NuGet package for HtmlAgilityPack:
https://www.nuget.org/packages/HtmlAgilityPack/
I looked and Metacritic.com doesn't have an API.
You can use an HttpWebRequest to get the contents of a website as a string.
using System.Net;
using System.IO;
using System.Text;
using System.Windows.Forms;

string result = null;
string url = "http://www.stackoverflow.com";
WebResponse response = null;
StreamReader reader = null;

try
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = "GET";
    response = request.GetResponse();
    reader = new StreamReader(response.GetResponseStream(), Encoding.UTF8);
    result = reader.ReadToEnd();
}
catch (Exception ex)
{
    // handle error
    MessageBox.Show(ex.Message);
}
finally
{
    if (reader != null)
        reader.Close();
    if (response != null)
        response.Close();
}
Then you can parse the string for the data that you want by taking advantage of Metacritic's use of meta tags. Here's the information they have available in meta tags:
og:title
og:type
og:url
og:image
og:site_name
og:description
The format of each tag is: <meta name="og:title" content="In a World..." />
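As a sketch of pulling one of those meta tags out of the downloaded string: a regex like this is fragile against real-world HTML (attribute order, single quotes, extra whitespace), so treat it as illustrative; an HTML parser such as HtmlAgilityPack is more robust. `GetMetaContent` is a hypothetical helper name:

```csharp
using System.Text.RegularExpressions;

// Naive extraction of <meta name="og:..." content="..."> from an HTML string.
// Assumes double quotes and name-before-content attribute order.
static string GetMetaContent(string html, string name)
{
    var match = Regex.Match(html,
        "<meta\\s+name=\"" + Regex.Escape(name) + "\"\\s+content=\"([^\"]*)\"");
    return match.Success ? match.Groups[1].Value : null;
}
```

You would call it with the `result` string from the HttpWebRequest above, e.g. GetMetaContent(result, "og:title").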
I recommend Dcsoup. There's a NuGet package for it, and it uses CSS selectors, so it is familiar if you use jQuery. I've tried others, but it is the best and easiest to use that I've found. There's not much documentation, but it's open source and a port of the Java jsoup library, which has good documentation. (Documentation for the .NET API is here.) I absolutely love it.
var timeoutInMilliseconds = 5000;
var uri = new Uri("http://www.metacritic.com/game/pc/fallout-4");
var doc = Supremes.Dcsoup.Parse(uri, timeoutInMilliseconds);
// <span itemprop="ratingValue">86</span>
var ratingSpan = doc.Select("span[itemprop=ratingValue]");
int ratingValue = int.Parse(ratingSpan.Text);
// selectors match both critic and user scores
var scoreDiv = doc.Select("div.score_summary");
var scoreAnchor = scoreDiv.Select("a.metascore_anchor");
int criticRating = int.Parse(scoreAnchor[0].Text);
float userRating = float.Parse(scoreAnchor[1].Text);
I'd recommend WebsiteParser - it's based on HtmlAgilityPack (mentioned by Hanlet Escaño), but it makes web scraping easier with attributes and CSS selectors:
class PersonModel
{
    [Selector("#BirdthDate")]
    [Converter(typeof(DateTimeConverter))]
    public DateTime BirdthDate { get; set; }
}
// ...
PersonModel person = WebContentParser.Parse<PersonModel>(html);
NuGet link

How to trigger an executable upon update of an RSS feed

I have an RSS feed URL, that I can view in any Feed Reader.
This RSS feed is not controlled by me, it is only consumed by me.
This RSS feed (the Office of Inspector General's Excluded Provider List) links to a page with downloadable files.
These files are updated approximately once a month, and the RSS feed displays new "unread" items.
What I want to do is write something (in C#) that checks this RSS feed once a week and, when a new item (i.e. a new downloadable file) is available, triggers an executable.
This is essentially a very scaled-down RSS reader, with the sole purpose of triggering an executable when a new item appears.
Any guidance, advice would be greatly appreciated.
Edit:
I need help in determining when a new item becomes available for download. The running of an executable I can do; the executable that runs will process the downloaded file.
As a commenter already noted, this question is quite broad, but here's an attempt to answer:
You can either write a Windows Service (use a template that comes with VS/MonoDevelop) or you can write a simple console app that would be called by Windows Scheduler or Cron.
The main code will use one of the many RSS feed parsers available:
There are plenty of examples here on SO. IMO, the simplest LINQ-based is here
I personally like this approach, also using LINQ.
Once you parse the feed, you need to look for the value of the link element, found by doing this (from the SO example above):
....
var feeds = from feed in feedXML.Descendants("item")
            select new
            {
                Title = feed.Element("title").Value,
                Link = feed.Element("link").Value,
                Description = feed.Element("description").Value
            };
....
So, now that you have the executable, you'll need to download it to your machine. I suggest you look into this example from MSDN:
Now that you have the file downloaded, simply use Process.Start("Path to EXE"); to execute it.
Watch out for viruses in the EXEs!
If you are using .NET 3.5 or above, you can use the various classes within the System.ServiceModel.Syndication namespace, specifically the SyndicationFeed class, which exposes a LastUpdatedTime property that you can use to compare dates to know when to call your executable using the Process.Start method in the System.Diagnostics namespace.
using (XmlReader reader = XmlReader.Create(path))
{
    SyndicationFeed feed = SyndicationFeed.Load(reader);
    if ((feed != null) && (feed.LastUpdatedTime > feedLastUpdated))
    {
        // Launch process
    }
}
So you have to read the RSS feed from the URL, and then parse the data to determine whether a new item is available.
To read the feed, you'll want to use a WebClient. The simplest way:
var MyClient = new WebClient();
string rssData = MyClient.DownloadString("http://whatever");
You can then create an XML document from the returned string.
var feedXML = new XmlDocument();
feedXML.LoadXml(rssData); // LoadXml parses a string; Load expects a URL or file path
@dawebber shows how to parse the XML with LINQ. You'll want to check the date on each item to see if it's newer than the last date checked. Or perhaps you have a database of items that you've already seen, and you want to check whether the items you received are in the database.
Whenever you find a new item, you can fire off your executable using Process.Start.
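Putting the detection step together, here is a minimal sketch. `HasNewItems` is a hypothetical helper; the pubDate element is standard RSS 2.0, and `lastChecked` would come from wherever you persist state between runs:

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

// Does the feed contain any item published after lastChecked?
static bool HasNewItems(string rssXml, DateTime lastChecked)
{
    var doc = XDocument.Parse(rssXml);
    return doc.Descendants("item")
              .Select(item => DateTime.Parse((string)item.Element("pubDate")))
              .Any(published => published > lastChecked);
}

// In the scheduled job, fire the executable on a new item, e.g.:
// if (HasNewItems(rssData, lastChecked))
//     System.Diagnostics.Process.Start(pathToExe); // pathToExe is your own setting
```

After triggering, persist the newest pubDate you saw as the next run's `lastChecked`.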
You could write a system tray application. I've done several that screen-scrape/monitor sites on a scheduled basis. Here is a VERY simple start; I think you could do what you're looking for in a few hours.
http://alperguc.blogspot.com/2008/11/c-system-tray-minimize-to-tray-with.html
