I'm using C# MVC to implement an API whose data comes from a remote MongoDB server. While testing the API, a MongoDB connection timeout error sometimes appears. Does anyone know how to solve this problem?
This is the connection code:
var settings = new MongoClientSettings
{
    Credentials = new[] { MongoCredential.CreateMongoCRCredential("database", "user", "pw") },
    Server = new MongoServerAddress("XXX.XXX.XXX", XXXXX),
    ConnectTimeout = TimeSpan.FromSeconds(1800),
    MaxConnectionIdleTime = TimeSpan.FromSeconds(1800),
};
// Get a reference to the client object
var mongoClient = new MongoClient(settings);
var mongoServer = mongoClient.GetServer();
var database = mongoServer.GetDatabase("MongoDB");
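One thing worth checking before tuning timeouts: the MongoDB driver documentation recommends a single shared `MongoClient` per process, because the client owns the connection pool; constructing a fresh client (and settings) per request can exhaust connections and surface as intermittent timeouts. A minimal sketch of that pattern, assuming a 2.x driver where `Credential` replaced `Credentials` (the port and the SCRAM credential call are placeholders; note `CreateMongoCRCredential` targets the MONGODB-CR mechanism, which was removed in MongoDB 4.0):

```csharp
using System;
using MongoDB.Driver;

public static class Mongo
{
    // MongoClient maintains its own connection pool and is thread-safe,
    // so one shared instance per process is the recommended pattern.
    // Creating a new client per request can exhaust connections and
    // show up as intermittent connection timeouts.
    private static readonly Lazy<MongoClient> LazyClient =
        new Lazy<MongoClient>(() =>
        {
            var settings = new MongoClientSettings
            {
                // SCRAM credential; MONGODB-CR was removed in MongoDB 4.0.
                Credential = MongoCredential.CreateCredential("database", "user", "pw"),
                Server = new MongoServerAddress("XXX.XXX.XXX.XXX", 27017), // port is a placeholder
                // Modest timeouts make real network problems fail fast
                // instead of holding requests open for 30 minutes.
                ConnectTimeout = TimeSpan.FromSeconds(30)
            };
            return new MongoClient(settings);
        });

    public static IMongoDatabase TestConnectDB() =>
        LazyClient.Value.GetDatabase("MongoDB");
}
```

Callers keep calling `Mongo.TestConnectDB()` as before; only the client construction moves behind the `Lazy<T>`.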
I changed my code as follows, but I still encounter the same problem:
var settings = new MongoClientSettings
{
    Credentials = new[] { MongoCredential.CreateMongoCRCredential("database", "user", "pw") },
    Server = new MongoServerAddress("XXX.XXX.XXX.XXX", XXXXX),
    ConnectTimeout = TimeSpan.FromSeconds(1800),
    MaxConnectionIdleTime = TimeSpan.FromSeconds(1800)
};
var mongoClient = new MongoClient(settings);
var database = mongoClient.GetDatabase(ConfigurationManager.AppSettings["mongoDBdatabase"]);
return database;
and I fetch the data with the following code:
var database = Mongo.TestConnectDB();
var collection = database.GetCollection<BsonDocument>("User");
var builder = Builders<BsonDocument>.Filter;
var filter = builder.Eq("_id", ObjectId.Parse(id));
var cursor = await collection.FindAsync(filter);
var friendList = new List<MongoModels.User.Circle.Friend>();
while (await cursor.MoveNextAsync())
{
    var user = cursor.Current;
    friendList = BsonSerializer.Deserialize<MongoModels.User>(user.First()).circle.friend;
}
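Since `_id` is unique, the cursor loop above can only ever keep its last (single) match; the same lookup can be written with a typed collection and `FirstOrDefaultAsync`. A sketch, assuming `MongoModels.User` maps the document (including its `_id`) through the driver's class mapping:

```csharp
// Hypothetical simplification of the lookup above.
var database = Mongo.TestConnectDB();
var collection = database.GetCollection<MongoModels.User>("User");

var filter = Builders<MongoModels.User>.Filter.Eq("_id", ObjectId.Parse(id));

// FirstOrDefaultAsync returns null when no document matches,
// and the typed collection removes the manual BsonSerializer call.
var user = await collection.Find(filter).FirstOrDefaultAsync();
var friendList = user?.circle?.friend
                 ?? new List<MongoModels.User.Circle.Friend>();
```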
I'm using Unity and want to continuously receive MQTT data that I can attach to a game object, so I'm trying to open a connection like this:
async void testconnect()
{
    const string server = "somewebadress.com";
    const string clientId = "Test";
    var caCert = new X509Certificate2(@"ca.crt");
    var clientCert = new X509Certificate2(@"cert.pfx", "test");
    var source = new CancellationTokenSource();
    var token = source.Token;
    var factory = new MqttFactory();
    var mqttClient = factory.CreateMqttClient();
    var mqttOptions = new MqttClientOptionsBuilder()
        .WithTcpServer(server, 8883)
        .WithClientId(clientId)
        .WithTls(new MqttClientOptionsBuilderTlsParameters
        {
            UseTls = true,
            AllowUntrustedCertificates = true,
            Certificates = new List<X509Certificate> { caCert, clientCert }
        })
        .Build();
    mqttClient.ConnectAsync(mqttOptions, token).Wait(token);
}
But I'm receiving the following exception:
NotImplementedException: The method or operation is not implemented.
Mono.Unity.UnityTlsContext.ProcessHandshake ()
I've tried to figure it out myself, but I have no clue how to implement the ProcessHandshake() function.
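The stack trace points at Mono.Unity.UnityTlsContext, i.e. Unity's Mono TLS stack rather than MQTTnet itself. One way to narrow this down is to connect without TLS first, assuming the broker also listens on the standard plain port 1883. A diagnostic sketch only (MQTTnet 3.x namespaces assumed; don't ship plaintext MQTT):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using MQTTnet;
using MQTTnet.Client.Options;

public static class MqttSmokeTest
{
    // Diagnostic only: connecting WITHOUT TLS shows whether the broker and
    // MQTTnet work at all, isolating the failure to Unity's TLS stack.
    public static async Task ConnectPlainAsync()
    {
        var factory = new MqttFactory();
        var mqttClient = factory.CreateMqttClient();

        var options = new MqttClientOptionsBuilder()
            .WithTcpServer("somewebadress.com", 1883) // assumed plain-TCP listener
            .WithClientId("Test")
            .Build();

        using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(10));
        // await rather than .Wait(): blocking Unity's main thread can
        // freeze the editor and mask the real exception.
        await mqttClient.ConnectAsync(options, cts.Token);
    }
}
```

If this succeeds, the problem is confined to the TLS handshake inside Unity's Mono runtime, which can depend on the scripting backend and the API compatibility level selected in the player settings.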
Our code currently uses the old Microsoft.WindowsAzure.Storage libraries for blob storage access in Azure. I am trying to replace them with the new v12 Azure.Storage.Blobs libraries, but I cannot figure out how to decrypt/encrypt the blobs. The MS docs (https://learn.microsoft.com/en-us/azure/storage/blobs/storage-encrypt-decrypt-blobs-key-vault?tabs=dotnet) helpfully say that the v12 code snippets aren't ready yet, so there are no code examples.
The old code is like this:
var tokenProvider = new AzureServiceTokenProvider();
var cloudResolver = new KeyVaultKeyResolver(
    new KeyVaultClient.AuthenticationCallback(tokenProvider.KeyVaultTokenCallback));
var encryptionThingy = await cloudResolver.ResolveKeyAsync(<Key Vault URL> + "/keys/" + <key name>, CancellationToken.None);
var policy = new BlobEncryptionPolicy(encryptionThingy, cloudResolver);
var options = new BlobRequestOptions() { EncryptionPolicy = policy };
await <ICloudBlob Instance>.DownloadToStreamAsync(<stream>, null, options, null);
With the new code, this is as far as I've gotten:
var azureKeys = new KeyClient(new Uri(<key vault url>), new DefaultAzureCredential());
var encKey = azureKeys.GetKey(<key name>);
ClientSideEncryptionOptions encryptionOptions = new ClientSideEncryptionOptions(ClientSideEncryptionVersion.V1_0)
{
    KeyEncryptionKey = (IKeyEncryptionKey)encKey
};
var bsClient = new BlobServiceClient(cStr, new SpecializedBlobClientOptions() { ClientSideEncryption = encryptionOptions });
var containerClient = new BlobContainerClient(cStr, containerName);
bClient = containerClient.GetBlobClient(<blob name>);
Of course this throws an exception, because KeyVaultKey cannot be converted to IKeyEncryptionKey. So my questions are:
Can the key be converted to an IKeyEncryptionKey easily, and how?
Can a key resolver be easily retrieved from the Azure SDKs, and how so?
I'm presuming there are ways to do this that don't involve creating our own implementations of the interfaces, but MS in their infinite wisdom didn't see fit to add those few lines to their documentation.
I wrote a simple demo for you. Try the C# console app below, which does Azure Blob client-side encryption with Azure Key Vault:
using System;
using System.Collections.Generic;
using Azure.Identity;
using Azure.Security.KeyVault.Keys.Cryptography;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;

namespace BlobEncyptionWithBlob
{
    class Program
    {
        static void Main(string[] args)
        {
            string keyVaultName = "";
            string keyName = "";
            string kvUri = "https://" + keyVaultName + ".vault.azure.net/keys/" + keyName;

            string storageConnStr = "";
            string containerName = "";
            string encyptBlob = "encypt.txt";
            string localblobPath = @"C:\Users\Administrator\Desktop\123.txt";
            string localblobPath2 = @"C:\Users\Administrator\Desktop\123-decode.txt";

            // Below is the recommended OAuth2 approach
            //string clientID = "<OAuth Client ID>";
            //string clientSecret = "<OAuth Secret>";
            //string tenant = "<OAuth Tenant ID>";
            //var cred = new ClientSecretCredential(tenant, clientID, clientSecret);

            // This is what you use to directly replace the older AppAuthentication
            var cred = new DefaultAzureCredential();

            CryptographyClient cryptoClient = new CryptographyClient(new Uri(kvUri), cred);
            KeyResolver keyResolver = new KeyResolver(cred);

            ClientSideEncryptionOptions encryptionOptions = new ClientSideEncryptionOptions(ClientSideEncryptionVersion.V1_0)
            {
                KeyEncryptionKey = cryptoClient,
                KeyResolver = keyResolver,
                KeyWrapAlgorithm = "RSA-OAEP"
            };

            BlobClientOptions options = new SpecializedBlobClientOptions() { ClientSideEncryption = encryptionOptions };
            var blobClient = new BlobServiceClient(storageConnStr, options).GetBlobContainerClient(containerName).GetBlobClient(encyptBlob);

            // Upload the local file to the container
            blobClient.Upload(localblobPath);

            // If you want to modify the metadata, you have to copy the existing
            // metadata first; there seems to be a bug in the library that wipes
            // out the "encryptiondata" metadata if you write your own metadata.
            var myMeta = new Dictionary<string, string>();
            myMeta.Add("comment", "dis file is da shiznit");
            foreach (var existingMeta in blobClient.GetProperties().Value.Metadata)
            {
                if (!myMeta.ContainsKey(existingMeta.Key))
                {
                    myMeta.Add(existingMeta.Key, existingMeta.Value);
                }
            }
            blobClient.SetMetadata(myMeta);

            // Download from the container to check that it is decrypted
            blobClient.DownloadTo(localblobPath2);
        }
    }
}
Result: the local .txt file is readable, the uploaded blob's content is encrypted, and downloading it again yields the decrypted content.
I'm building a Web API and I need to get data from a Google Analytics view.
But I think I'm facing an issue with the credentials.
Here is the code I'm using.
var filepath = "XXXXXXXXXXXXXXXXx";
var viewid = "XXXXXXXXXX";

GoogleCredential credentials;
using (var stream = new FileStream(filepath, FileMode.Open, FileAccess.Read))
{
    string[] scopes = { AnalyticsReportingService.Scope.AnalyticsReadonly };
    var googleCredential = GoogleCredential.FromStream(stream);
    credentials = googleCredential.CreateScoped(scopes);
}

var reportingService = new AnalyticsReportingService(
    new BaseClientService.Initializer
    {
        HttpClientInitializer = credentials
    });

var dateRange = new DateRange
{
    StartDate = "2018-06-01",
    EndDate = "2018-06-25"
};
var sessions = new Metric
{
    Expression = "ga:pageviews",
    Alias = "Sessions"
};
var date = new Dimension { Name = "ga:date" };

var reportRequest = new ReportRequest
{
    DateRanges = new List<DateRange> { dateRange },
    Dimensions = new List<Dimension> { date },
    Metrics = new List<Metric> { sessions },
    ViewId = viewid
};
var getReportsRequest = new GetReportsRequest
{
    ReportRequests = new List<ReportRequest> { reportRequest }
};
var batchRequest = reportingService.Reports.BatchGet(getReportsRequest);
var response = batchRequest.Execute();

foreach (var x in response.Reports.First().Data.Rows)
{
    Console.WriteLine(string.Join(", ", x.Dimensions) + " " + string.Join(", ", x.Metrics.First().Values));
}
I'm getting the following error:
Google.GoogleApiException: 'Google.Apis.Requests.RequestError
User does not have any Google Analytics account. [403]
Errors [
Message[User does not have any Google Analytics account.] Location[ - ]
Reason[forbidden] Domain[global]
Thanks,
Andrés
Make sure the credentials' user has access to the Google Analytics account you are trying to access.
You must grant it access in the analytics page, just as you would for any other user.
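If the credential file is a service-account key, the "user" that needs to be granted access is the client_email inside the JSON key file. A small stdlib sketch to print it (the file path and the stub key contents here are stand-ins; a real key file has many more fields):

```csharp
using System;
using System.IO;
using System.Text.Json;

class ShowServiceAccountEmail
{
    static void Main()
    {
        // Path to the service-account key file (a stand-in name here);
        // use the same file you pass to GoogleCredential.FromStream.
        string keyFilePath = "service-account.json";

        // Minimal stub key file so this sketch is self-contained.
        File.WriteAllText(keyFilePath,
            "{\"type\":\"service_account\",\"client_email\":\"my-robot@my-project.iam.gserviceaccount.com\"}");

        using var doc = JsonDocument.Parse(File.ReadAllText(keyFilePath));
        string email = doc.RootElement.GetProperty("client_email").GetString();

        // This is the address you add as a user in the Analytics admin UI.
        Console.WriteLine(email);
    }
}
```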
Hello, I have created a Windows application which uploads an image from the local disk to Google Cloud Storage.
My code was working perfectly, but after I changed the bucket name it stopped working.
Both buckets are in the same project, and I have set up OAuth 2.0 for the project.
No error is shown while processing. Please help me.
string bucketForImage = ConfigurationManager.AppSettings["BucketName"];
string projectName = ConfigurationManager.AppSettings["ProjectName"];
string Accountemail = ConfigurationManager.AppSettings["Email"];

var clientSecrets = new ClientSecrets();
clientSecrets.ClientId = ConfigurationManager.AppSettings["ClientId"];
clientSecrets.ClientSecret = ConfigurationManager.AppSettings["ClientSecret"];

string gcpPath = @"D:\mrunal\tst_mrunal.png";
var scopes = new[] { "https://www.googleapis.com/auth/devstorage.full_control" };
var cts = new CancellationTokenSource();

var userCredential = await GoogleWebAuthorizationBroker.AuthorizeAsync(clientSecrets, scopes, Accountemail, cts.Token);
var service = new Google.Apis.Storage.v1.StorageService();

var bucketToUpload = bucketForImage;
var newObject = new Google.Apis.Storage.v1.Data.Object()
{
    Bucket = bucketToUpload,
    Name = "mrunal.png"
};

fileStream = new FileStream(gcpPath, FileMode.Open);
var uploadRequest = new Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload(service, newObject,
    bucketToUpload, fileStream, "image/png");
uploadRequest.OauthToken = userCredential.Token.AccessToken;
await uploadRequest.UploadAsync();

if (fileStream != null)
{
    fileStream.Dispose();
}
Did you try this same code with the older bucket, and did it work? It seems to me that there is an issue with the line uploadRequest.OauthToken = userCredential.Token.AccessToken. You are reading Token.AccessToken directly from userCredential; when the authorization task is not awaited, those members should be accessed through userCredential.Result.
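For context on the await-versus-Result point: if AuthorizeAsync is awaited (as in the question's code), the variable already holds the credential, so .Result does not apply; .Result is only needed when the Task object itself is stored, and blocking on it carries a deadlock risk. A minimal stdlib illustration of the difference, with a fake token standing in for a real credential:

```csharp
using System;
using System.Threading.Tasks;

class AwaitVsResult
{
    // Stand-in for an async authorization call.
    static async Task<string> GetTokenAsync()
    {
        await Task.Delay(10); // simulate network latency
        return "fake-token";
    }

    static async Task Main()
    {
        // Correct: await unwraps the Task<string> into a string.
        string awaited = await GetTokenAsync();

        // .Result also unwraps, but blocks the calling thread and can
        // deadlock in UI or classic ASP.NET synchronization contexts.
        string blocked = GetTokenAsync().Result;

        Console.WriteLine(awaited == blocked); // True
    }
}
```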
I want to import data from Google Analytics.
var gas = new AnalyticsService(auth);
var r = gas.Data.Ga.Get("ga:6332XXXX", "2013-02-01", "2013-02-11", "ga:visits");
r.Dimensions = "ga:date";
r.Sort = "ga:visits";
r.StartIndex = 1;
var data = r.Fetch();
I get a 400 Bad Request error from the Fetch method. What is wrong with my code?
My full code is as follows:
var scope = AnalyticsService.Scopes.AnalyticsReadonly.ToString();
var clientId = "--------.apps.googleusercontent.com";
var keyFile = @"C:\-----------------privatekey.p12";
var keyPassword = "notasecret";

var desc = GoogleAuthenticationServer.Description;
var key = new X509Certificate2(keyFile, keyPassword, X509KeyStorageFlags.Exportable);

var client = new AssertionFlowClient(desc, key)
{
    ServiceAccountId = clientId,
    Scope = scope
};
var auth = new OAuth2Authenticator<AssertionFlowClient>(client, AssertionFlowClient.GetState);
var gas = new AnalyticsService(auth);

var r = gas.Data.Ga.Get("ga:6332XXXX", "2013-02-01", "2013-02-11", "ga:visits");
r.Dimensions = "ga:date";
r.Sort = "ga:visits";
r.StartIndex = 1;
var data = r.Fetch();
Thanks for your interest.
A couple of ideas:
your clientId should be your SERVICE ACCOUNT EMAIL (not the ID)
your scope should be "https://www.googleapis.com/auth/analytics.readonly"
If this doesn't work, check @Martyn's answer.
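Putting both suggestions together, the relevant part of the setup from the question would become the following (the email is a placeholder for your service account's address):

```csharp
// Same setup as in the question with the two suggested fixes applied.
var serviceAccountEmail = "--------@developer.gserviceaccount.com";
var keyFile = @"C:\-----------------privatekey.p12";
var key = new X509Certificate2(keyFile, "notasecret", X509KeyStorageFlags.Exportable);

var client = new AssertionFlowClient(GoogleAuthenticationServer.Description, key)
{
    ServiceAccountId = serviceAccountEmail,  // the service account EMAIL, not the client ID
    Scope = "https://www.googleapis.com/auth/analytics.readonly"  // full scope URL
};
var auth = new OAuth2Authenticator<AssertionFlowClient>(client, AssertionFlowClient.GetState);
var gas = new AnalyticsService(auth);
```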