I am trying to get all my status messages using the following code:
public dynamic downloadStatuses(FacebookOAuthResult facebookOAuthResult, string userInput)
{
dynamic result = null;
if (facebookOAuthResult != null)
{
if (facebookOAuthResult.IsSuccess)
{
this.accessToken = facebookOAuthResult.AccessToken;
var fb = new FacebookClient(facebookOAuthResult.AccessToken);
result = fb.Get(userInput + "/statuses?format=json&limit=1500");
return result;
}
else
{
MessageBox.Show(facebookOAuthResult.ErrorDescription);
return result;
}
}
return result;
}
userInput would be my Facebook ID. Some statuses are returned, but definitely not the 1500 I specified. I was wondering if anyone knows how to access your first ever Facebook status message, or how to retrieve all the status messages by slightly modifying the request URL in the code. Is it necessary to use FQL?
When you query /statuses you don't get the photo/album/video-related statuses/entries from your stream (including their comments etc.). This also includes event creation and questions.
You will have to query the API with separate calls (i.e. /me/photos) to get everything. If you want almost the full stream of a person, page, or group, consider using FQL and querying the stream table.
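For reference, a rough sketch of what such an FQL call looked like with the same FacebookClient as in the question (FQL has long since been removed from the Graph API, and the field list and LIMIT here are purely illustrative):
// Inside the same method as above, reusing the OAuth access token.
var fb = new FacebookClient(facebookOAuthResult.AccessToken);
// FQL was exposed through the Graph API's /fql endpoint.
dynamic streamPosts = fb.Get("fql", new
{
    q = "SELECT post_id, message, created_time FROM stream WHERE source_id = me() LIMIT 100"
});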
EDIT: Just saw this question is probably outdated, my bad.
I need to read all users from Active Directory. Here is the code I am using:
using Novell.Directory.Ldap;
using Novell.Directory.Ldap.Controls;
using System.Linq;
namespace LdapTestApp
{
class Program
{
static void Main()
{
LdapConnection ldapConn = new LdapConnection();
ldapConn.SecureSocketLayer = true;
ldapConn.Connect(HOST, PORT);
try
{
var cntRead = 0;
int? cntTotal = null;
var curPage = 0;
ldapConn.Bind(USERNAME, PASSWORD);
do
{
var constraints = new LdapSearchConstraints();
constraints.SetControls(new LdapControl[]
{
new LdapSortControl(new LdapSortKey("sn"), true),
new LdapVirtualListControl("sn=*", 0, 10)
});
ILdapSearchResults searchResults = ldapConn.Search(
"OU=All Users,DC=homecredit,DC=ru",
LdapConnection.ScopeSub,
"(&(objectCategory=person)(objectClass=user))",
null,
false,
constraints
);
while (searchResults.HasMore() && ((cntTotal == null) || (cntRead < cntTotal)))
{
++cntRead;
try
{
LdapEntry entry = searchResults.Next();
}
catch (LdapReferralException)
{
continue;
}
}
++curPage;
cntTotal = GetTotalCount(searchResults as LdapSearchResults);
} while ((cntTotal != null) && (cntRead < cntTotal));
}
finally
{
ldapConn.Disconnect();
}
}
private static int? GetTotalCount(LdapSearchResults results)
{
if (results.ResponseControls != null)
{
var r = (from c in results.ResponseControls
let d = c as LdapVirtualListResponse
where (d != null)
select (LdapVirtualListResponse)c).SingleOrDefault();
if (r != null)
{
return r.ContentCount;
}
}
return null;
}
}
}
I used the question Page LDAP query against AD in .NET Core using Novell LDAP as a basis.
Unfortunately, I get this exception when I try to retrieve the very first entry:
"Unavailable Critical Extension"
000020EF: SvcErr: DSID-03140594, problem 5010 (UNAVAIL_EXTENSION), data 0
What am I doing wrong?
VLVs are browsing indexes and are not directly related to whether or not you can browse large numbers of entries (see the generic documentation). So even if this control were activated on your AD, you wouldn't be able to retrieve more than 1000 elements this way:
how VLVs work on AD
MaxPageSize is 1000 by default on AD (see documentation)
So what you can do:
use a specific paged results control, but it seems that the Novell C# LDAP library does not have one
ask yourself: "is it pertinent to look for all the users in a single request?" (your request looks like a batch request: remember that an LDAP server is not designed for the same purposes as a classic database - which can easily return millions of entries - and that's why most LDAP directories have default size limits around 1000).
The answer is no: review your design, be more specific in your LDAP search filter, your search base, etc.
The answer is yes:
you have a single AD server: ask your administrator to change the MaxPageSize value, but this setting is global and can lead to several side effects (i.e. what happens if everybody starts requesting all the users all the time?)
you have several AD servers: you can configure one for specific "batch-like" queries like the one you're trying to do (so a large MaxPageSize, large timeouts, etc.)
I had to use the approach described here:
https://github.com/dsbenghe/Novell.Directory.Ldap.NETStandard/issues/71#issuecomment-420917269
The solution is far from perfect, but at least I am able to move on.
Starting with version 3.5 the library supports the Simple Paged Results Control - https://ldapwiki.com/wiki/Simple%20Paged%20Results%20Control - and the usage is as simple as ldapConnection.SearchUsingSimplePaging(searchOptions, pageSize) or ldapConnection.SearchUsingSimplePaging(ldapEntryConverter, searchOptions, pageSize). See the GitHub repo for more details - https://github.com/dsbenghe/Novell.Directory.Ldap.NETStandard - and more specifically use the tests as usage samples.
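A minimal sketch of such a paged search, assuming version 3.5+ of the library and that SearchOptions is constructed from the search base, scope, filter and attribute list (the exact signatures are best taken from the repo's tests; HOST, PORT, USERNAME and PASSWORD are the same placeholders as in the question):
using System;
using Novell.Directory.Ldap;

class PagedSearchSample
{
    static void ReadAllUsers()
    {
        var ldapConn = new LdapConnection { SecureSocketLayer = true };
        try
        {
            ldapConn.Connect(HOST, PORT);
            ldapConn.Bind(USERNAME, PASSWORD);

            // Assumed constructor order: search base, scope, filter, attributes.
            var searchOptions = new SearchOptions(
                "OU=All Users,DC=homecredit,DC=ru",
                LdapConnection.ScopeSub,
                "(&(objectCategory=person)(objectClass=user))",
                null);

            // Pages through the result set with the Simple Paged Results control,
            // staying under AD's MaxPageSize on every round trip.
            var entries = ldapConn.SearchUsingSimplePaging(searchOptions, 500);
            foreach (var entry in entries)
            {
                Console.WriteLine(entry.Dn);
            }
        }
        finally
        {
            ldapConn.Disconnect();
        }
    }
}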
I am attempting to create the backend of a Twitch Extension that will search for all other users streaming the same game. The only way to get the category with which to compare games is if the channel in question is live. This is fine, as the extension only needs to work once the stream goes live.
My issue is that I do not know how I can determine whether or not the user of the extension has gone online.
public static async Task<string> GetStreamAsync()
{
string product = "";
List<Stream> finalProduct = new List<Stream>();
var response = await _httpClient.GetAsync("https://api.twitch.tv/helix/streams?user_login=Ninja");
if (response.IsSuccessStatusCode)
{
product = await response.Content.ReadAsStringAsync();
var streamToRelate = JsonConvert.DeserializeObject<RootStream>(product);
finalProduct = await GetAllRelatedChannels(streamToRelate.Data.FirstOrDefault());
}
return product;
}
As you can see, I've hard-coded a stream that I knew was online at the time (Ninja). This data was returned perfectly fine, but the next step is to pass a string into this function with which to search. How could I go about getting the "User" of the extension at runtime? If I am completely off-base and am missing certain concepts, please let me know as well and I will do further research if necessary.
I am trying to send a login and a password and, if a match is found in the database, return the account details. But I am getting this error in my console:
ERROR HttpErrorResponse {headers: HttpHeaders, status: 500, statusText: "Internal Server Error", url: "http://localhost:56624/api/Usuarios"
And here is my API code:
public Usuarios Post([FromBody]LoginRequest login)
{
var usuario = new Usuarios();
var con = new SqlConnection(System.Configuration.ConfigurationManager.ConnectionStrings["comidas"].ConnectionString);
var com = new SqlCommand("SELECT * FROM usuarios WHERE loginUsuario = @loginUsuario AND senhaUsuario = @senhaUsuario", con);
com.Parameters.AddWithValue("@loginUsuario", login.loginUsuario);
com.Parameters.AddWithValue("@senhaUsuario", login.senhaUsuario);
con.Open();
var rdr = com.ExecuteReader();
if (rdr.HasRows)
{
usuario.idUsuario = (int)rdr["idUsuario"];
usuario.loginUsuario = (string)rdr["loginUsuario"];
usuario.senhaUsuario = (string)rdr["senhaUsuario"];
usuario.nomeUsuario = (string)rdr["nomeUsuario"];
usuario.emailUsuario = (string)rdr["emailUsuario"];
usuario.telefoneUsuario = (string)rdr["telefoneUsuario"];
return usuario;
}
con.Close();
return null;
}
If you spot any mistake please tell me, I am new to C#.
Here is the request code in JavaScript:
getUser() {
let head = new HttpHeaders();
head.set('Content-Type', 'application/json');
let body = {
loginUsuario: '*login*',
senhaUsuario: '*password*'
}
return this.http.post(this.url + 'Usuarios', body, { headers: head }).subscribe(data => {
this.user = data;
console.log(data);
});
}
and the classes:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
namespace MeinNahrungAPI.Models
{
public class Usuarios
{
public int idUsuario;
public string loginUsuario;
public string senhaUsuario;
public string tipoUsuario;
public string nomeUsuario;
public string emailUsuario;
public string telefoneUsuario;
}
}
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
namespace MeinNahrungAPI.Models
{
public class LoginRequest
{
public string loginUsuario;
public string senhaUsuario;
}
}
As a result of the comment thread on the question...
You're getting an exception here:
usuario.idUsuario = (int)rdr["idUsuario"];
I'm not familiar with the specific translation of the exception from your Portuguese settings, but essentially what it means is that you're attempting to read data when none is available. Which is slightly misleading, since there is data returned from the query (which is why rdr.HasRows resolves to true). But you haven't called rdr.Read() to access the first record in the results. Even if there's only one record, technically it's still a collection of records which happens to have only one entry. And a DataReader needs to be told to progress through that collection.
Take a look at some examples here. Specifically at how they structure reading the results:
while (reader.Read())
{
//...
}
In this case calling .Read() returns a bool indicating whether or not there is a next record to read, but it also is quietly advancing the DataReader internally to access that next record.
You can use this same loop structure and expect it to only loop one time since there should be only one returned record. (It wouldn't hurt to add in some error checking to handle unexpected possibilities if more than one record is returned.) Or you could even just call .Read() once:
if (rdr.HasRows)
{
rdr.Read();
// the rest of your code
}
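Putting that together, a minimal sketch of what the corrected action could look like (same Usuarios and LoginRequest models as in the question; note it still returns the password in plain text, which the next point advises against):
public Usuarios Post([FromBody] LoginRequest login)
{
    var connectionString = System.Configuration.ConfigurationManager
        .ConnectionStrings["comidas"].ConnectionString;

    using (var con = new SqlConnection(connectionString))
    using (var com = new SqlCommand(
        "SELECT * FROM usuarios WHERE loginUsuario = @loginUsuario AND senhaUsuario = @senhaUsuario", con))
    {
        com.Parameters.AddWithValue("@loginUsuario", login.loginUsuario);
        com.Parameters.AddWithValue("@senhaUsuario", login.senhaUsuario);
        con.Open();

        using (var rdr = com.ExecuteReader())
        {
            // Advance to the first (and expected only) record before reading columns.
            if (rdr.Read())
            {
                return new Usuarios
                {
                    idUsuario = (int)rdr["idUsuario"],
                    loginUsuario = (string)rdr["loginUsuario"],
                    senhaUsuario = (string)rdr["senhaUsuario"],
                    nomeUsuario = (string)rdr["nomeUsuario"],
                    emailUsuario = (string)rdr["emailUsuario"],
                    telefoneUsuario = (string)rdr["telefoneUsuario"]
                };
            }
        }
    }
    return null;
}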
Another thing to note, which is unrelated to the problem but is important regardless. First off, as a beginner, the fact that you're using query parameters instead of directly concatenating string values is a very good thing. However, there's another security problem in your code that you should be made aware of. You are storing user passwords in plain text. This is a very bad thing. You don't want to be able to ever see or know your users' passwords.
There are existing authentication tools in the .NET Framework which can do much of the work for you. But even when creating your own, you should always hash the passwords. (Not encrypt, hash. It's a very important distinction.) And with good hashing methods (not MD5). That way nobody can ever recover the original password. Not an attacker, not even you as the system owner.
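For illustration only (not part of the original answer): one common approach is salted PBKDF2 via Rfc2898DeriveBytes, sketched below assuming .NET Framework 4.7.2+ or .NET Core 2.0+ for the HashAlgorithmName overload. Built-in frameworks such as ASP.NET Identity handle all of this for you.
using System.Security.Cryptography;

public static class PasswordHasher
{
    private const int Iterations = 100000;

    public static byte[] GenerateSalt(int size = 16)
    {
        // A fresh random salt per user, stored alongside the hash.
        var salt = new byte[size];
        using (var rng = RandomNumberGenerator.Create())
        {
            rng.GetBytes(salt);
        }
        return salt;
    }

    public static byte[] Hash(string password, byte[] salt)
    {
        // PBKDF2 with SHA-256; store the result, never the password itself.
        using (var pbkdf2 = new Rfc2898DeriveBytes(password, salt, Iterations, HashAlgorithmName.SHA256))
        {
            return pbkdf2.GetBytes(32);
        }
    }

    public static bool Verify(string password, byte[] salt, byte[] expectedHash)
    {
        var actual = Hash(password, salt);
        // Constant-time comparison so the check doesn't leak timing information.
        var diff = actual.Length ^ expectedHash.Length;
        for (var i = 0; i < actual.Length && i < expectedHash.Length; i++)
        {
            diff |= actual[i] ^ expectedHash[i];
        }
        return diff == 0;
    }
}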
At the moment I'm using Cosmos DB with a fixed 10 GB collection where I save all my data, and I want to continue this way. So do I just need to create an unlimited collection? When I do that it asks for a partition key, which I don't quite understand the purpose of. I have tried to read about it but didn't get much wiser. I hope you guys can help me.
Today I create documents like this. Should it be done in another way when using an unlimited collection? I'm thinking about whether I should declare the partition key somewhere:
protected async Task<bool> CreateDocumentAsync(Resource document)
{
var collectionUri = UriFactory.CreateDocumentCollectionUri(_db.Options.Value.DatabaseName, _db.Options.Value.CollectionName);
ResourceResponse<Document> result = null;
for (int i = 0; i < MaxRetryCount; i++)
{
try
{
result = await _db.Client.CreateDocumentAsync(UriFactory.CreateDocumentCollectionUri(_db.Options.Value.DatabaseName, _db.Options.Value.CollectionName), document);
break;
}
catch (DocumentClientException dex) when (dex.StatusCode.HasValue && (int)dex.StatusCode.Value == 429)
{
_logger.LogWarning($"");
await Task.Delay(dex.RetryAfter);
}
}
if (result == null)
return false;
int statusCode = (int)result.StatusCode;
return statusCode >= 200 && statusCode < 300;
}
The partition key value can be specified by providing the RequestOptions object with the PartitionKey value set.
Your create document line would become:
var requestOptions = new RequestOptions{
PartitionKey = new PartitionKey("yourPartitionKeyValueHere")
};
result = await _db.Client.CreateDocumentAsync(UriFactory.CreateDocumentCollectionUri(_db.Options.Value.DatabaseName, _db.Options.Value.CollectionName), document, requestOptions);
However, it does sound like you don't understand Cosmos DB partitioning, which means you can fall into a really deep pitfall.
The partition key definition should be a property that, most of the time if not always, is known to you when you are querying or reading documents.
I highly recommend you watch this video. It will greatly help you understand partitioning: https://azure.microsoft.com/en-us/resources/videos/azure-documentdb-elastic-scale-partitioning/
On a side note, I can see several mistakes in the way you are using the Cosmos DB SDK which I cannot fix in a single answer. For example, the SDK already has logic to handle the retry policy at the document client level via the RetryOptions object, so you don't need to write any custom retry code. I would suggest you read more of the Cosmos DB documentation.
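For instance, a sketch of configuring those throttling (429) retries once on the client with the DocumentDB v2 SDK, instead of hand-rolling a retry loop; the account endpoint and key below are placeholders:
using System;
using Microsoft.Azure.Documents.Client;

public static class CosmosClientFactory
{
    public static DocumentClient Create()
    {
        var connectionPolicy = new ConnectionPolicy
        {
            RetryOptions = new RetryOptions
            {
                // Retries on HTTP 429 (request rate too large) are handled by the SDK.
                MaxRetryAttemptsOnThrottledRequests = 9,
                MaxRetryWaitTimeInSeconds = 30
            }
        };

        return new DocumentClient(
            new Uri("https://your-account.documents.azure.com:443/"), // placeholder
            "your-auth-key",                                          // placeholder
            connectionPolicy);
    }
}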
I have posted question regarding firebase two days ago:
Android Firebase - add authenticated user into database
I got the help I needed and that solved the first problem, but now I have a new problem. I was googling for quite some time; there are some posts about this issue, but nothing solved my problem. I didn't want to spam the previous question, so I posted a new one.
When I try reading inserted data from the firebase database I get this error:
Newtonsoft.Json.JsonSerializationException: Error converting value
"test#user.com" to type 'carServiceApp.My_Classes.Account'. Path
'email', line 1, position 24.
Here is the code:
private async Task LoadData()
{
FirebaseUser users = FirebaseAuth.GetInstance(loginActivity.app).CurrentUser;
id = users.Uid;
var firebase = new FirebaseClient(loginActivity.FirebaseURL);
var items = await firebase.Child("users").Child(id).OnceAsync<Account>();
foreach (var item in items)
{
Account user = new Account();
user.uid = item.Object.uid;
user.name = item.Object.name;
user.lastName = item.Object.lastName;
user.phone = item.Object.phone;
user.email = item.Object.email;
userInput_ime.Text = user.name;
userInput_prezime.Text = user.lastName;
userInput_broj.Text = user.phone;
userInput_email.Text = user.email;
}
}
This is firebase data:
-users
-jwAP2dYNzJeiF3QlmEIEQoruUkO2
email: "test#user.com"
lastName: "user"
name: "test"
phone: "12421"
uid: "jwAP2dYNzJeiF3QlmEIEQoruUkO2"
Interesting thing is that when I try reading data with this:
var items = await firebase.Child("users").OnceAsync<Account>();
This works fine (I get the last inserted user). But when I add the uid node (the extra Child() call), I get the error. I have been trying to solve this for quite some time, but I just can't figure it out. I guess there is no problem with the Account class, because it works in the case without the uid node but fails when the extra Child() call is added.
Other information (Account class code and the way of storing that data into the database) you can see in the link at the top.
Note: I tried adding a constructor to the Account class, but that doesn't help.
OK, so I didn't exactly find a solution for this problem, nor do I really understand why it was happening, but I have found a workaround. I believe it's not an ideal solution and that it does not fix the underlying problem; or maybe the problem was me not understanding Firebase logic. But here is what I came up with.
So, considering that it all worked fine if I didn't specify that uid node, it was obvious there was some mismatch between the class and the data in Firebase. Anyway, I decided to keep that last uid node so I can have a specific user selected and also have the same data in Firebase as in the case where everything worked. So, this is how I inserted data into Firebase:
var item = await firebase.Child("users").Child(id).PostAsync<Account>(user);
This created the users node and the user-id child node, and the PostAsync method created one more node with a random key.
So when I tried reading with this:
var data = await firebase.Child("users").Child(id).OnceAsync<Account>();
It worked without problems. Now the Firebase data looks like this:
users
JPKdQbwcXbhBatZ2ihBNLRauhV83
-LCXyLpvdfQ448KOPKUp
email: "spider#man.com"
lastName: "man"
name: "spider"
phone: "14412"
uid: "JPKdQbwcXbhBatZ2ihBNLRauhV83"
There is a bit of redundancy; I basically have two IDs. But I don't understand how to structure my class to get the data any other way, so I did it this way. It works fine.
If anyone has a better solution, I will gladly switch to it. Cheers
This was supposed to be a comment, but it is meant as an addition for anyone who needs help with this issue.
I know this answer has been out there for a while, but this still seems to be a recurring structural quirk with Firebase and the usage of its rules. I ran into this issue with a complex structure that looked roughly like this:
-Orders
-9876trfghji (User ID)
-0
BusnID: "ty890oihg"
Name: "Some Name"
AddOns: Object
ItemData: Object(containing other objects)
UserID: "9876trfghji"
Note: In this case, as well as in cordas' case, you will see that both of the final objects have a UserID or uid.
I also ran into the issue of deserializing the object into a class without having the actual user ID in the object's data when it was sent back to the device.
The reason you have a "redundant" user ID is as a security measure with the Firebase rules. With the first UserID in the structure above, you can control access to the information based on the user's ID without needing an extra validation clause in the rules. Currently, as of this post, the rule below would protect the data based on the user ID.
"Orders": {
"$uid":{
".read":"auth != null",
".write":"auth.uid == $uid"
}
}
This allows only the user with the matching authorized user ID to write content, while anyone with valid credentials can read the data.
The second user ID has to be placed in the object because without it you would not be able to do a standard cast to the object: the object would not have all of the data it needs to be created. Regardless of whether you are using a package like Google's Gson or Newtonsoft.Json, the object still isn't complete.
There is, however, a workaround for this problem besides re-entering the user ID into the object. With the object I have above, I decided to just re-enter the user ID in my own code to save the time and hassle of manual creation.
Using the Firebase.Database NuGet package you can build the object manually. Here is an example based on the object in cordas' problem:
public static void GetUser_Firebase(User user, FirebaseApp app)
{
FirebaseDatabase database = FirebaseDatabase.GetInstance(app);
DatabaseReference reference = database.GetReference($"/users/{user.UserID}");
//"Using for getting firebase information", $"/users/{user.UserID}"
reference.AddListenerForSingleValueEvent(new UserInfo_DataValue(user.UserID));
}
class UserInfo_DataValue : Java.Lang.Object, IValueEventListener
{
private string ID;
public UserInfo_DataValue(string uid)
{
this.ID = uid;
}
public void OnCancelled(DatabaseError error)
{
//"Failed To Get User Information For User "
}
public void OnDataChange(DataSnapshot snapshot)
{
Dictionary<string, string> Map = new Dictionary<string, string>();
var items = snapshot.Children?.ToEnumerable<DataSnapshot>(); // using Linq
foreach(DataSnapshot item in items)
{
try
{
Map.Add(item.Key, item.Value.ToString()); // item.value is a Java.Lang.Object
}
catch(Exception ex)
{
//"EXCEPTION WITH DICTIONARY MAP"
}
}
User toReturn = new User();
toReturn.UserID = this.ID;
foreach (var item in Map)
{
switch (item.Key)
{
case "email":
toReturn.email = item.Value;
break;
case "lastName":
toReturn.lastName = item.Value;
break;
case "name":
toReturn.name = item.Value;
break;
case "phone":
toReturn.phone = item.Value;
break;
}
}
}
}
Update
There is something I would like to mention that I left out when writing this: the usage of the Firebase.Database NuGet package together with the Gson NuGet package and the Newtonsoft.Json library.
If you decide to use the Firebase.Database library, just know that you will be working very closely with the Java.Lang and Java.Util libraries. Objects like Java.Lang.Object can be very difficult and time-consuming to deserialize by hand, but don't fear, Gson is here!
The Gson package can take a large load of work off your hands for class deserialization if you let it. Gson is a library that lets you turn a Java.Lang.Object into a JSON string. I know it seems weird (hand it an object, get back a string sounds counter-intuitive), but bear with me.
Here is an example of how to use the Gson library with the object in cordas' problem:
public static void Get_User(User user, FirebaseApp app)
{
FirebaseDatabase database = FirebaseDatabase.GetInstance(app);
DatabaseReference reference = database.GetReference($"Users/{user.UserID}");
reference.AddListenerForSingleValueEvent(new User_DataValue(user, app));
//$"Trying to make call for user orders Users/{user.UserID}");
}
class User_DataValue : Java.Lang.Object, IValueEventListener
{
private User User;
private FirebaseApp app;
public User_DataValue(User user, FirebaseApp app)
{
this.User = user;
this.app = app;
}
public void OnCancelled(DatabaseError error)
{
//$"Failed To Get User Orders {error.Message}");
}
public void OnDataChange(DataSnapshot snapshot)
{
//"Data received for user orders");
var gson = new GsonBuilder().SetPrettyPrinting().Create();
var json = gson.ToJson(snapshot.Value); // Gson: Java object -> JSON string
Formatted_Output("Data received for user order json ", json);
User user = JsonConvert.DeserializeObject<User>(json); // Newtonsoft.Json: JSON string -> .NET object
//now the user is a fully populated object with very little work
}
}
For anyone who might run into this in the future, I hope this helps.