I have an abstract class that inherits from TableEntity.
public abstract class AzureEntityBase : TableEntity
{
public AzureEntityBase()
{
}
public virtual string TableName
{
get
{
return string.Empty;
}
}
private string ObjectHash { get; set; }
public bool IsBackedUp { get; set; }
}
I then have a class that derives from this abstract class:
public class DepartmentTotalEntity : AzureEntityBase
{
public override string TableName
{
get
{
return "DepartmentTotals";
}
}
public Int64 SessionDateTimeInteger { get; set; }
public string StoreID { get { return PartitionKey; } set { PartitionKey = value; } }
public string DetailKey { get; set; }
public string RangeString { get; set; }
public string DateStart { get; set; }
public string DateEnd { get; set; }
public Int64 DateStartInt { get; set; }
public Int64 DateEndInt { get; set; }
public string Dept_ID { get; set; }
public string DepartmentDescription { get; set; }
public decimal Quantity { get; set; }
public decimal TotalPrice { get; set; }
}
I am submitting a revised entity to Azure table storage with the IsBackedUp value set to false.
I then have a service that runs on an Azure Compute instance and copies rows from one table storage account to another table storage account in a different Azure datacenter. After all the rows are copied, I want to limit what I grab on the next round of copying, and the IsBackedUp field is supposed to do this.
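For reference, the next round would be limited with a filter on IsBackedUp, along these lines (a simplified sketch against the same CloudTable, not my exact copy code):
// Hypothetical query: fetch only the rows that have not yet been backed up.
TableQuery<DepartmentTotalEntity> notBackedUpQuery = new TableQuery<DepartmentTotalEntity>()
    .Where(TableQuery.GenerateFilterConditionForBool("IsBackedUp", QueryComparisons.Equal, false));
foreach (DepartmentTotalEntity row in table.ExecuteQuery(notBackedUpQuery))
{
    // copy the row to the destination table...
}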
I run a function that loops over the rows already inserted and checks whether they exist at the destination; if they do, it updates the source table row to reflect that it was backed up and writes that back to Azure via the following code:
foreach (DepartmentTotalEntity row in CopiedRows)
{
DepartmentTotalEntity find = dst.BrowseSingle(row.PartitionKey, row.RowKey);
if (find != null)
{
row.IsBackedUp = true;
int tmp = src.InsertOrReplace(row);
}
}
The integer returned from InsertOrReplace is the HttpStatusCode of the TableOperation, and it always reads 204, which is expected for a successful write to ATS.
For completeness, here is the InsertOrReplace function:
public int InsertOrReplace(DepartmentTotalEntity input)
{
if (input.PartitionKey.IsNull())
{
throw new ArgumentNullException("PartitionKey");
}
if (input.RowKey.IsNull())
{
throw new ArgumentNullException("RowKey");
}
TableOperation replaceOperation = TableOperation.InsertOrReplace(input);
TableResult result = table.Execute(replaceOperation);
return result.HttpStatusCode;
}
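BrowseSingle is not shown above; it is essentially a point query by PartitionKey and RowKey against the destination table, along these lines (a simplified sketch, not the exact code):
public DepartmentTotalEntity BrowseSingle(string partitionKey, string rowKey)
{
    // Point query: retrieve a single entity by its partition and row key.
    TableOperation retrieveOperation = TableOperation.Retrieve<DepartmentTotalEntity>(partitionKey, rowKey);
    TableResult result = table.Execute(retrieveOperation);
    return result.Result as DepartmentTotalEntity; // null when the entity does not exist
}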
The main problem is that the IsBackedUp field is not being updated when the InsertOrReplace command is called in the third block of code.
I'm banging my head against a wall here trying to figure out why ATS will not accept my revision.
I can successfully change the value of IsBackedUp using Azure Table Storage Explorer. I have confirmed that the datatype of the column is Boolean.
Any help would be greatly appreciated. Let me know if I have posted enough of the code to be of assistance. The only class not posted is the rest of the class that surrounds the last code block; it is over 2,000 lines, so I omitted it for brevity. That class holds the CloudTable, CloudTableClient, and CloudStorageAccount variables.
I am trying to use SQLite-Net Extensions to create a relational database. I'm running into an issue when trying to pull the Term object from the database. It successfully pulls over its associated courses, but not the courses' associated assessments and notes. I'm not sure if the problem lies in how I insert the objects into the database, how I pull the objects from the database, or how I have the objects' attributes listed.
I feel like the SQLite-Net Extensions documentation is extremely limited, so I'm not even sure what's going on. I've tried it many different ways, including adding CascadeOperations, but none of those seemed to help.
Here is the (simplified) code for my objects:
[Table("Terms")]
public class Term
{
[PrimaryKey, AutoIncrement]
public int ID { get; set; }
public string Name { get; set; }
[OneToMany]
public List<Course> Courses { get; set; }
public Term() { }
public Term(string name, List<Course> courses)
{
Name = name;
Courses = courses;
}
}
Courses
[Table("Courses")]
public class Course
{
[PrimaryKey, AutoIncrement]
public int ID { get; set; }
[ForeignKey(typeof(Term))]
public int TermID { get; set; }
public string Name { get; set; }
[OneToMany]
public List<Assessment> Assessments { get; set; }
[OneToMany]
public List<Note> Notes { get; set; }
public Course() { }
public Course(string name, List<Assessment> assessments, List<Note> notes)
{
Name = name;
Assessments = assessments;
Notes = notes;
}
}
Assessments
[Table("Assessments")]
public class Assessment
{
[PrimaryKey, AutoIncrement]
public int ID { get; set; }
[ForeignKey(typeof(Course))]
public int CourseID { get; set; }
public string Name { get; set; }
public Assessment() { }
public Assessment(string name)
{
Name = name;
}
}
Notes
[Table("Notes")]
public class Note
{
[PrimaryKey, AutoIncrement]
public int ID { get; set; }
[ForeignKey(typeof(Course))]
public int CourseID { get; set; }
public string Name { get; set; }
public string Text { get; set; }
public Note() { }
public Note(string name, string note)
{
Name = name;
Text = note;
}
}
And here is the code for inserting and getting objects:
Inserting
public bool SaveTermAsync(Term term)
{
if (term.ID != 0)
{
_database.UpdateWithChildrenAsync(term);
return true;
}
else
{
foreach (var course in term.Courses)
{
foreach (var assessment in course.Assessments)
{
_database.InsertAsync(assessment);
}
foreach (var note in course.Notes)
{
_database.InsertAsync(note);
}
_database.InsertAsync(course);
}
_database.InsertAsync(term);
_database.UpdateWithChildrenAsync(term);
return false;
}
}
Getting
public Task<List<Term>> GetTermsAsync()
{
return _database.GetAllWithChildrenAsync<Term>();
}
I know it's a bit of a code dump, but I have no idea where things are going wrong. If anyone could offer any insight into what might be happening, that would be awesome. Perhaps I'm simply expecting something to happen that isn't actually how it works; I don't know.
Also, if anyone has links to better documentation than https://bitbucket.org/twincoders/sqlite-net-extensions/src/master/, that would be awesome.
EDIT
I tried using cascade options as well: CascadeRead, CascadeInsert, and CascadeAll. Using CascadeInsert or CascadeAll with _database.InsertWithChildrenAsync(term, true) resulted in a crash. The crash does not provide any error messages, and even wrapping the InsertWithChildren call in a try/catch block didn't help. Removing the recursive bool kept the program from crashing and actually got me closest to what I'm looking for: Assessments and Notes are no longer null, but they are still empty. Here's my updated code:
Saving and Getting:
public async Task<List<Term>> GetTermsAsync()
{
return await _database.GetAllWithChildrenAsync<Term>(recursive: true);
}
public async void SaveTermAsync(Term term)
{
if (term.ID != 0)
{
await _database.UpdateWithChildrenAsync(term);
}
else
{
//Trying this with recursion results in crash
await _database.InsertWithChildrenAsync(term);
}
}
One-To-Many Relationships:
//In Term
[OneToMany(CascadeOperations = CascadeOperation.All)]
public List<Course> Courses { get; set; }
//In Courses
[OneToMany(CascadeOperations = CascadeOperation.All)]
public List<Assessment> Assessments { get; set; }
[OneToMany(CascadeOperations = CascadeOperation.All)]
public List<Note> Notes { get; set; }
Also, I forgot to include last time how I'm populating the tables in the first place.
public bool CreateTables()
{
_database.CreateTableAsync<Term>().Wait();
_database.CreateTableAsync<Course>().Wait();
_database.CreateTableAsync<Assessment>().Wait();
_database.CreateTableAsync<Note>().Wait();
return true;
}
public Task<int> ClearTablesTest()
{
_database.DropTableAsync<Term>();
_database.DropTableAsync<Course>();
_database.DropTableAsync<Assessment>();
return _database.DropTableAsync<Note>();
}
async public Task<int> PopulateTestData()
{
await ClearTablesTest();
CreateTables();
Term term = new Term("Test Term", true, DateTime.Now, DateTime.Now.AddDays(10),
new List<Course>
{
new Course("Test Course", CourseStatus.Completed, "Guys Name", "(999)-999-9999", "email#gmail.com", 6, DateTime.Now, DateTime.Now.AddDays(10),
new List<Assessment>
{
new Assessment("Test Assessment", AssessmentType.Objective, false, DateTime.Now, DateTime.Now.AddDays(10))
},
new List<Note>
{
new Note("Test Note", "This is a test note.")
})
});
App.Database.SaveTermAsync(term);
return 0;
}
I finally figured out what was causing the crash as well as causing general confusion within SQLite-Net Extensions.
In my Assessment class, the property
public string BackgroundColor
{
get { return IsComplete ? "#558f45" : "Gray"; }
set { BackgroundColor = value; }
}
was causing the crash when recursion was used. I've been scouring the web for over two weeks looking for solutions to this issue, but haven't found anything similar to this. I submitted a bug report on the SQLite-Net Extensions bitbucket.
If anyone knows why this specific line would cause issues, I'd love to hear your input. Until then I'm going to mark this question as answered and continue work on my app.
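For anyone hitting something similar: the setter on that property assigns to itself, so any attempt to set BackgroundColor recurses forever, which may be why the crash is silent. A workaround sketch (assuming sqlite-net's [Ignore] attribute is available in your setup) is to make it a read-only, unmapped property so the extensions never try to store or populate it:
using SQLite;

public class Assessment
{
    [PrimaryKey, AutoIncrement]
    public int ID { get; set; }

    public bool IsComplete { get; set; }

    // Computed, UI-only value. [Ignore] keeps sqlite-net / SQLite-Net Extensions
    // from mapping it to a column or touching it during recursive operations.
    [Ignore]
    public string BackgroundColor
    {
        get { return IsComplete ? "#558f45" : "Gray"; }
    }
}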
Thanks @redent84 for your help thus far on this issue.
I followed the steps described in this tutorial. My case is a little bit different:
Instead of indexing Hotels and Rooms, I am indexing Candidates and Resumes.
Instead of using CosmosDB, I am using an Azure SQL Database.
Following the tutorial, I am able to create the Index, the 2 Indexers (one for the SQL DB and one for the Blobs storage), and the 2 data sources.
The SQL DB contains all my candidates, and the storage contains all their resumes (files in PDF/DOC/DOCX format). Each blob has a "ResumeCandidateId" metadata value that matches the candidate's "CandidateId".
I have the following fields for my Index:
[SerializePropertyNamesAsCamelCase]
public partial class Candidate
{
[Key]
[IsFilterable, IsRetrievable(true), IsSearchable]
public string CandidateId { get; set; }
[IsFilterable, IsRetrievable(true), IsSearchable, IsSortable]
public string LastName { get; set; }
[IsFilterable, IsRetrievable(true), IsSearchable, IsSortable]
public string FirstName { get; set; }
[IsFilterable, IsRetrievable(true), IsSearchable, IsSortable]
public string Notes { get; set; }
public ResumeBlob[] ResumeBlobs { get; set; }
}
[SerializePropertyNamesAsCamelCase]
public class ResumeBlob
{
[IsRetrievable(true), IsSearchable]
[Analyzer(AnalyzerName.AsString.StandardLucene)]
public string content { get; set; }
[IsRetrievable(true)]
public string metadata_storage_content_type { get; set; }
public long metadata_storage_size { get; set; }
public DateTime metadata_storage_last_modified { get; set; }
public string metadata_storage_name { get; set; }
[Key]
[IsRetrievable(true)]
public string metadata_storage_path { get; set; }
[IsRetrievable(true)]
public string metadata_content_type { get; set; }
public string metadata_author { get; set; }
public DateTime metadata_creation_date { get; set; }
public DateTime metadata_last_modified { get; set; }
public string ResumeCandidateId { get; set; }
}
As you can see, one Candidate can have multiple Resumes. The challenge is to populate the ResumeBlobs property...
The data from the SQL DB is indexed and mapped correctly by its indexer. When I run the blobs indexer, it loads the documents; however, it does not map them, and they never show up in the search results (ResumeBlobs is always empty). Here is the code used to create the blobs indexer:
var blobDataSource = DataSource.AzureBlobStorage(
name: "azure-blob-test02",
storageConnectionString: "DefaultEndpointsProtocol=https;AccountName=yyy;AccountKey=xxx;EndpointSuffix=core.windows.net",
containerName: "2019");
await searchService.DataSources.CreateOrUpdateAsync(blobDataSource);
List<FieldMapping> map = new List<FieldMapping> {
new FieldMapping("ResumeCandidateId", "CandidateId")
};
Indexer blobIndexer = new Indexer(
name: "hotel-rooms-blobs-indexer",
dataSourceName: blobDataSource.Name,
targetIndexName: indexName,
fieldMappings: map,
//parameters: new IndexingParameters().SetBlobExtractionMode(BlobExtractionMode.ContentAndMetadata).IndexFileNameExtensions(".DOC", ".DOCX", ".PDF", ".HTML", ".HTM"),
schedule: new IndexingSchedule(TimeSpan.FromDays(1)));
bool exists = await searchService.Indexers.ExistsAsync(blobIndexer.Name);
if (exists)
{
await searchService.Indexers.ResetAsync(blobIndexer.Name);
}
await searchService.Indexers.CreateOrUpdateAsync(blobIndexer);
try
{
await searchService.Indexers.RunAsync(blobIndexer.Name);
}
catch (CloudException e) when (e.Response.StatusCode == (HttpStatusCode)429)
{
Console.WriteLine("Failed to run indexer: {0}", e.Response.Content);
}
I commented out the parameters for the blobIndexer, but I get the same results whether or not they are commented out.
When I run a search, here is an example of what I get:
{
"#odata.context": "https://yyy.search.windows.net/indexes('index-test01')/$metadata#docs(*)",
"value": [
{
"#search.score": 1.2127206,
"candidateId": "363933d1-7e81-4ed2-b82e-d7496d98db50",
"lastName": "LAMLAST",
"firstName": "ZFIRST",
"notes": "MGA ; SQL ; T-SQL",
"resumeBlobs": []
}
]
}
"resumeBlobs" is empty. Any idea how to do such a mapping?
AFAIK, Azure Search doesn't support the collection merge feature that would be necessary to implement your scenario.
An alternative approach to this is to create a separate index for resumes and point the resume indexer to that index. That means that some of your search scenarios will have to hit two indexes, but it's a path forward.
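A minimal sketch of that alternative, reusing the SDK calls from your snippet (the index and indexer names below are made up):
// Hypothetical: a dedicated index for resumes, built from the ResumeBlob model.
Index resumeIndex = new Index
{
    Name = "resume-index",
    Fields = FieldBuilder.BuildForType<ResumeBlob>()
};
await searchService.Indexes.CreateOrUpdateAsync(resumeIndex);

// Point the blob indexer at the resume index. ResumeCandidateId stays on each resume
// document, so the client can join resume hits back to the candidate index.
Indexer resumeIndexer = new Indexer(
    name: "resume-blobs-indexer",
    dataSourceName: blobDataSource.Name,
    targetIndexName: resumeIndex.Name,
    schedule: new IndexingSchedule(TimeSpan.FromDays(1)));
await searchService.Indexers.CreateOrUpdateAsync(resumeIndexer);
await searchService.Indexers.RunAsync(resumeIndexer.Name);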
I have a .NET 4.5.2 test app playing about with Azure Mobile Services, and I'm attempting to store data using a TableController. I have my data types as follows:
public class Run : EntityData
{
public int RunId { get; set; }
public DateTime? ActivityStarted { get; set; }
public DateTime? ActivityCompleted { get; set; }
public List<Lap> LapInformation { get; set; }
public Run()
{
LapInformation = new List<Lap>();
}
}
public class Lap
{
[Key]
public int LapNumber { get; set; }
public int CaloriesBurnt { get; set; }
public double Distance {get; set;}
//Some other basic fields in here
public DateTime? LapActivityStarted { get; set; }
public DateTime? LapActivityCompleted { get; set; }
public Lap()
{
}
}
In my Startup class I call:
HttpConfiguration config = new HttpConfiguration();
new MobileAppConfiguration()
.UseDefaultConfiguration()
.ApplyTo(config);
And in my MobileServiceContext class:
public class MobileServiceContext : DbContext
{
private const string connectionStringName = "Name=MS_TableConnectionString2";
public MobileServiceContext() : base(connectionStringName)
{
}
public DbSet<Run> Runs { get; set; }
public DbSet<Lap> Laps { get; set; }
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
modelBuilder.Conventions.Add(
new AttributeToColumnAnnotationConvention<TableColumnAttribute, string>(
"ServiceTableColumn", (property, attributes) => attributes.Single().ColumnType.ToString()));
}
}
In my controller then, I have:
[MobileAppController]
public class RunController: TableController<Run>
{
protected override void Initialize(HttpControllerContext controllerContext)
{
base.Initialize(controllerContext);
MobileServiceContext context = new MobileServiceContext();
DomainManager = new EntityDomainManager<Run>(context, Request);
}
public IList<Run> GetAllRuns()
{
var runs = context.Runs.Include("LapInformation").ToList();
return runs;
}
public SingleResult<Run> GetRun(string id)
{
return Lookup(id);
}
public async Task<IHttpActionResult> PostRun(Run run)
{
Run current = await InsertAsync(run);
return CreatedAtRoute("Tables", new { id = current.Id }, current);
}
public Task DeleteRun(string id)
{
return DeleteAsync(id);
}
}
I can then POST a record in Fiddler, which responds with a 201 and the Location of the newly created item. An example of the data I'm posting is:
{RunId: 1234, LapInformation:[{LapNumber:1,Distance:0.8, LapActivityStarted: "2017-06-19T00:00:00", LapActivityCompleted: "2017-06-19T00:00:00", CaloriesBurnt: 12}]}
However, when I GET that object, I'm only getting the fields from Run, without the list of Detail records (Lap). Is there anything I have to configure in Entity Framework so that when I GET a Run record from the DB, it also gets and deserializes all associated detail records?
Hopefully that makes sense.
EDIT
Turns out that it is pulling back all the lap information, but when I return it to the client, that information is getting lost.
You can use a custom EF query with the Include() method instead of the Lookup call, preferably the overload that takes a lambda, which comes from the System.Data.Entity namespace.
var runs = context.Runs.Include(r => r.LapInformation)
Take a look at https://msdn.microsoft.com/en-us/library/jj574232(v=vs.113).aspx
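Assuming the MobileServiceContext created in Initialize is kept in a context field on the controller, the GET action could look roughly like this:
using System.Data.Entity; // provides the lambda Include overload

public IList<Run> GetAllRuns()
{
    // Eager-load the child Lap rows so they are returned with each Run.
    return context.Runs
                  .Include(r => r.LapInformation)
                  .ToList();
}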
AFAIK, you could also use the $expand parameter to expand your collections as follows:
GET /tables/Run?$expand=LapInformation
Here is my sample; you could refer to it:
You could mark your action with a custom ActionFilterAttribute that automatically adds the $expand option to the query request, as follows:
// GET tables/TodoItem
[ExpandProperty("Tags")]
public IQueryable<TodoItem> GetAllTodoItems()
{
return Query();
}
For more details, you could refer to Adrian Hall's book, Chapter 3: Relationships.
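For reference, here is a sketch of what that ExpandProperty attribute can look like, adapted from the approach in that chapter (treat it as an illustration rather than the book's exact code):
using System;
using System.Linq;
using System.Web.Http.Controllers;
using System.Web.Http.Filters;

public class ExpandPropertyAttribute : ActionFilterAttribute
{
    private readonly string propertyName;

    public ExpandPropertyAttribute(string propertyName)
    {
        this.propertyName = propertyName;
    }

    public override void OnActionExecuting(HttpActionContext actionContext)
    {
        base.OnActionExecuting(actionContext);

        // Rewrite the incoming request URI so the OData $expand option
        // includes the named navigation property.
        var uriBuilder = new UriBuilder(actionContext.Request.RequestUri);
        var queryParams = uriBuilder.Query
            .TrimStart('?')
            .Split(new[] { '&' }, StringSplitOptions.RemoveEmptyEntries)
            .ToList();

        int expandIndex = queryParams.FindIndex(p => p.StartsWith("$expand", StringComparison.Ordinal));
        if (expandIndex < 0)
        {
            queryParams.Add("$expand=" + propertyName);
        }
        else
        {
            queryParams[expandIndex] += "," + propertyName;
        }

        uriBuilder.Query = string.Join("&", queryParams);
        actionContext.Request.RequestUri = uriBuilder.Uri;
    }
}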
EDIT Turns out that it is pulling back all the lap information, but when I return it to the client, that information is getting lost.
I defined the following models in my mobile client:
public class TodoItem
{
public string Id { get; set; }
public string UserId { get; set; }
public string Text { get; set; }
public List<Tag> Tags { get; set; }
}
public class Tag
{
public string Id { get; set; }
public string TagName { get; set; }
}
After executing the following pull operation, I could retrieve the tags as follows:
await todoTable.PullAsync("todoItems", todoTable.CreateQuery());
Note: The Tags data is read-only; you can only update the information in the TodoItem table.
Additionally, as Adrian Hall mentioned in Data Access and Offline Sync - The Domain Manager:
I prefer handling tables individually and handling relationship management on the mobile client manually. This causes more code on the mobile client but makes the server much simpler by avoiding most of the complexity of relationships.
I am sorry if this has already been answered, but I can't find any solution. Here is my (little) problem. Also, my apologies if the terms I use are approximate; I am far from being a skilled C# developer.
Note that I think my problem is similar to this one: Entity Framework validation error for missing field, but it's not missing?
I have a table "Tweets" with a tweet_id field (bigint) which is my primary key.
I use the following classes to load the table:
class TwitterDbContext : DbContext
{
public TwitterDbContext() : base("Twitter")
{
}
public DbSet<Stream> Streams { get; set; }
public DbSet<StreamParameter> StreamParameters { get; set; }
public DbSet<Tweet> Tweets { get; set; }
}
public class Tweet
{
public Tweet()
{
}
[Key]
public long tweet_id { get; set; }
public string tweet { get; set; }
public long creator { get; set; }
public double latitude { get; set; }
public double longitude { get; set; }
public string language { get; set; }
public DateTime created_at { get; set; }
public DateTime registered_at { get; set; }
public long? in_reply_to { get; set; }
public bool retweeted { get; set; }
}
I have another class that stores, during code execution, all the fields used by the Tweet table. For the purposes here, let's imagine I create it manually this way:
private void Test_Button_Click(object sender, RoutedEventArgs e)
{
Twts twtReceived = new Twts();
twtReceived.tweet_id = 1;
twtReceived.tweet = "test";
twtReceived.creator = 1;
twtReceived.latitude = -1;
twtReceived.longitude = -1;
twtReceived.language = "a";
twtReceived.created_at = DateTime.Now;
twtReceived.registered_at = DateTime.Now;
twtReceived.in_reply_to = 1;
twtReceived.retweeted = true;
AddTweet(twtReceived);
}
Now here is the AddTweet method:
static public void AddTweet(Twts twtReceived)
{
try
{
// update the tweet data in the database
using (var TwitterDb = new TwitterDbContext())
{
Tweet twt = new Tweet()
{
tweet_id = twtReceived.tweet_id,
tweet = twtReceived.tweet,
creator = twtReceived.creator,
longitude = twtReceived.longitude,
latitude = twtReceived.latitude,
language = twtReceived.language,
created_at = twtReceived.created_at,
registered_at = twtReceived.registered_at,
in_reply_to = twtReceived.in_reply_to,
retweeted = twtReceived.retweeted
};
TwitterDb.Tweets.Add(twt);
TwitterDb.SaveChanges();
}
}
catch(Exception ex)
{
MessageBox.Show(ex.InnerException.ToString());
}
}
I constantly have the same error message:
Cannot insert the value NULL into column 'tweet_id', table
'Twitter.dbo.Tweets'; column does not allow nulls. INSERT fails.
The thing is that when I inspect TwitterDb.Tweets.Local after TwitterDb.Tweets.Add(twt), tweet_id is correctly set to 1.
Any idea where the issue is?
Try marking your tweet_id field with the following (instead of just [Key]) if this is a primary key column for which you want to provide the values yourself:
[Required, Key, DatabaseGenerated(DatabaseGeneratedOption.None)]
If it is an auto-increment, then remove explicit assignments to this field and mark it as 'Identity' instead:
[Required, Key, DatabaseGenerated(DatabaseGeneratedOption.Identity)]
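For completeness, with the first option the key property would end up looking like this (a sketch; the rest of the class is unchanged):
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

public class Tweet
{
    // The id comes from Twitter, so EF must not treat the column as an identity.
    [Required, Key, DatabaseGenerated(DatabaseGeneratedOption.None)]
    public long tweet_id { get; set; }

    // ... remaining columns as before ...
}
Note that the annotation has to match how the column is actually defined in the database; if the existing table was created with tweet_id as an identity column, the column itself also needs to change.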
How to get this to work?
GSD is a class used to store cached images of SQL tables. GSD has several public static properties representing different "CacheTables", which are Dictionary<long, rowtypeclass> objects, each with a different rowtype class. The rowtype class objects model SQL table rows.
public class GSDataObject
{
private Dictionary<long, GRPListRow> prvGRPList;
private Dictionary<long, TestTableRow> prvTestTable;
//=======================================
public Dictionary<long, GRPListRow> GRPList
{
get { return prvGRPList;}
set { prvGRPList = value; }
}
//=====================================
public Dictionary<long, TestTableRow> TestTable
{
get { return prvTestTable; }
set { prvTestTable = value; }
}
}
public class TestTableRow
{
public int ID { get; set; }
public string Field1 { get; set; }
public string Field2 { get; set; }
public string Field3 { get; set; }
public string Field4 { get; set; }
public string Field5 { get; set; }
public string Field6 { get; set; }
public string Field7 { get; set; }
public string Field8 { get; set; }
}
GSD and its different CacheTable properties work fine when declared hard-coded; I want to access them with reflection.
Specifically, I want to get a particular Row from a particular CacheTable in an instance of GSD, update that row, and then put it back. The instructions below describe the "get the row" phase.
The first three instructions work, and the resulting wrkCacheTableObject is of the correct type Dictionary<long, wrkRowtype>. However, wrkCacheTableObject is not indexed, so I can't retrieve rows from it.
wrkGSD is an instance of the GSD class. wrkCacheTableName is the string name of the particular CacheTable property. wrkRowType is the string class name of the row type.
wrkRow = Activator.CreateInstance(wrkRowType);
PropertyInfo wrkTablePropInfo = wrkGSDOType.GetProperty(wrkCacheTableName);
object wrkCacheTableObject = wrkTablePropInfo.GetValue(wrkGSD, null); // <== gives correct CacheTable instance
wrkTableDictObject = (Dictionary<long, object>)wrkCacheTableObject; //<=== fails here
wrkRow = wrkTableDictObject[wrkRowID];
// update wrkRow fields using reflection //<== this works if I retrieve wrkRow via hard code
// put it back into wrkTableDictObject
// put wrkTableDictObject back into wrkGSD
I'm not fixed on this particular set of instructions. And maybe if I can get the first phase above to work, it will show me how to do the other phases.
Found the answer via Experts Exchange:
dynamic wrkCacheTableObject = wrkTablePropInfo.GetValue(wrkGSD, null);
//--- get the row using dynamic
dynamic wrkRow = wrkCacheTableObject[(long)varAJR.rowID];
//--- put the row back
wrkCacheTableObject[(long)varAJR.rowID]= wrkRow;
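An alternative that avoids dynamic: the cast in the original code fails because Dictionary<long, GRPListRow> is not a Dictionary<long, object> (generic types are not covariant), but every Dictionary<TKey, TValue> implements the non-generic System.Collections.IDictionary, so a sketch like this (names taken from the snippets above) should also work:
using System.Collections;   // non-generic IDictionary
using System.Reflection;    // PropertyInfo

// Get the cache table property value through the non-generic IDictionary interface.
PropertyInfo wrkTablePropInfo = wrkGSDOType.GetProperty(wrkCacheTableName);
IDictionary wrkTableDict = (IDictionary)wrkTablePropInfo.GetValue(wrkGSD, null);

// Retrieve the row as a plain object; its fields can then be read and updated via reflection.
object wrkRow = wrkTableDict[(long)varAJR.rowID];

// ... update wrkRow fields via reflection ...

// Put the row back. The dictionary is a reference type, so there is no need
// to reassign the CacheTable property on wrkGSD afterwards.
wrkTableDict[(long)varAJR.rowID] = wrkRow;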