We have one product with four different prices (Basic, Middle, etc.). Say a user bought the Basic package (a payment session is created, and after the payment completes a subscription is created), but after a while the user decides to change his package and upgrade to the Advanced package. In that case, should I use update subscription?
This actually updates the subscription from the $X package to the $Y package:
var service = new SubscriptionService();
Subscription subscription = service.Get("sub_49ty4767H20z6a");

var items = new List<SubscriptionItemOptions>
{
    new SubscriptionItemOptions
    {
        Id = subscription.Items.Data[0].Id,
        Price = "price_CBb6IXqvTLXp3f",
    },
};

var options = new SubscriptionUpdateOptions
{
    CancelAtPeriodEnd = false,
    ProrationBehavior = "create_prorations",
    Items = items,
};

subscription = service.Update("sub_49ty4767H20z6a", options);
Or should I just cancel the current subscription and create a new one? I'm asking because I chatted with Stripe support and got two different answers: one person told me that I need to use update subscription, the other told me that I need to attach a new subscription. I am confused.
The way that you've described this here, the update to upgrade the subscription makes more sense to me. Pay careful attention to the intended proration behaviour (do you want to credit the time left on the current plan?) and to whether you want to shift the billing period.
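For reference, the proration choice is controlled by ProrationBehavior on the update call. Here is a rough sketch using the same Stripe.NET client as in the question; "price_ADVANCED" is a placeholder for your advanced-package price ID, not a real value:

// Sketch: the same upgrade call, annotated with the proration choices.
// "price_ADVANCED" is a placeholder for your advanced-package price ID.
var options = new SubscriptionUpdateOptions
{
    Items = new List<SubscriptionItemOptions>
    {
        new SubscriptionItemOptions
        {
            Id = subscription.Items.Data[0].Id,
            Price = "price_ADVANCED",
        },
    },
    // "create_prorations" credits the unused time on the old price,
    // "always_invoice" does the same and invoices the difference immediately,
    // "none" switches the price with no credit or extra charge.
    ProrationBehavior = "create_prorations",
    // The underlying API also accepts billing_cycle_anchor = "now" to reset the
    // billing period at the moment of the upgrade; check your Stripe.NET version
    // for the corresponding property if you need that.
};
var updated = new SubscriptionService().Update(subscription.Id, options);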
I'd also point out that you should consider switching to separate Products. A Product is a single good/service that you sell, while Prices represent its different billing intervals and currencies. If you imagine a service with Silver, Gold and Platinum access levels, each of those is a different Product -- they offer more features! The Product is also what gets shown on the invoice, so unless you use separate Products, the invoices would be indistinguishable aside from the rate paid.
Most of the time, doing what you've done doesn't present any specific obstacles, but the Customer Portal enforces one Price per Product per interval. If you had two annual Prices for the "same product" but with different amounts, why would anybody pay the higher one? Because they're paying for something different, a different Product.
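To illustrate (not your exact setup; the plan name and amount below are made up), creating a separate Product with its own recurring Price in Stripe.NET looks roughly like this:

// Hypothetical sketch: one Product per access level, each with its own recurring Price.
var productService = new ProductService();
var priceService = new PriceService();

var gold = productService.Create(new ProductCreateOptions { Name = "Gold plan" });

var goldMonthly = priceService.Create(new PriceCreateOptions
{
    Product = gold.Id,
    UnitAmount = 2900,          // $29.00, made-up amount
    Currency = "usd",
    Recurring = new PriceRecurringOptions { Interval = "month" },
});

// Invoices now show "Gold plan" rather than a generic product at a different rate.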
Quick answer: to upgrade from Basic to Pro, remove the Basic item, add the Pro item, and prorate (do not change the billing cycle):
// subscriptionId -> your subscription id
// additionalStripePriceIds -> Stripe price ids you are going to add
// removalStripePriceIds -> Stripe price ids you are going to remove
var getService = new SubscriptionService();
var getServiceResult = await getService.GetAsync(subscriptionId);

var items = new List<SubscriptionItemOptions>();

if (removalStripePriceIds != null && removalStripePriceIds.Any())
{
    foreach (var removalStripePriceId in removalStripePriceIds)
    {
        var item = getServiceResult.Items.Data.FirstOrDefault(i => i.Price.Id == removalStripePriceId);
        if (item != null)
        {
            var subscriptionItemOption = new SubscriptionItemOptions
            {
                Id = item.Id,
                Deleted = true,
            };
            items.Add(subscriptionItemOption);
        }
    }
}

foreach (var additionalStripePriceId in additionalStripePriceIds)
{
    var subscriptionItemOption = new SubscriptionItemOptions
    {
        Price = additionalStripePriceId,
    };
    items.Add(subscriptionItemOption);
}

var updateService = new SubscriptionService();
var updateOptions = new SubscriptionUpdateOptions
{
    CancelAtPeriodEnd = false,
    ProrationBehavior = "create_prorations",
    Items = items,
};
await updateService.UpdateAsync(subscriptionId, updateOptions);
I have the method below in a Web API that pulls data.
I am building an app which will have a ListView with default data coming from this method.
I want this data to change each time any user starts the app.
How can I generate random data with this method? There are about four different categories.
public IEnumerable<ArticlesDto> Find(string category)
{
    IEnumerable<ArticlesDto> objArticles = null;
    var context = new ArticlesContext();
    objArticles = (from j in context.Information
                   where j.Category == category
                   select new ArticlesDto()
                   {
                       Id = j.Id,
                       Headlines = j.Headlines,
                       Url = j.Url,
                       Category = j.Category,
                       Summary = j.Summary
                   });
    return objArticles;
}
Example: the first time I use the app, I see a list of about 20 rows (default data).
The second time I use it, I see a different list of another 20 rows, different from the last time I used the app.
Why don't you try using AutoFixture? This framework will help you generate random data every time your Web API call is made. Here is the GitHub link. Please mark as answer if this helps.
https://github.com/AutoFixture
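For example, a rough sketch (the namespace is AutoFixture in recent versions, Ploeh.AutoFixture in older ones; ArticlesDto and the count of 20 come from the question):

using AutoFixture;

// Sketch: build 20 ArticlesDto instances with generated property values.
var fixture = new Fixture();
IEnumerable<ArticlesDto> randomArticles = fixture.CreateMany<ArticlesDto>(20);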
Just OrderBy a random value and then take as many rows as you like:
// LINQ to Entities cannot translate Random.Next(), but Guid.NewGuid()
// maps to NEWID() in SQL Server, which gives a random order per query.
objArticles = context.Information.Where(i => i.Category == category)
    .OrderBy(i => Guid.NewGuid())
    .Select(i => new ArticlesDto
    {
        Id = i.Id,
        Headlines = i.Headlines,
        Url = i.Url,
        Category = i.Category,
        Summary = i.Summary
    }).Take(20);
I have an export job migrating data from an old database into a new database. The problem I'm having is that the user population is around 3 million, and the job takes a very long time to complete (15+ hours). The machine I am using only has one processor, so I'm not sure whether threading is what I should be doing. Can someone help me optimize this code?
static void ExportFromLegacy()
{
    var exportStart = DateTime.Now;
    var usersQuery = _oldDb.users.Where(x => x.status == "active");

    int BatchSize = 1000;
    var errorCount = 0;
    var successCount = 0;
    var batchCount = 0;

    // Using MoreLinq's Batch for sequences
    // https://www.nuget.org/packages/MoreLinq.Source.MoreEnumerable.Batch
    foreach (IEnumerable<users> batch in usersQuery.Batch(BatchSize))
    {
        Console.WriteLine(String.Format("Batch count at {0}", batchCount));
        batchCount++;

        foreach (var user in batch)
        {
            try
            {
                var userData = _oldDb.userData.Where(x => x.user_id == user.user_id).ToList();
                if (userData.Count > 0)
                {
                    // Insert into table
                    var newData = new newData()
                    {
                        UserId = user.user_id // shortened code for brevity.
                    };
                    _db.newUserData.Add(newData);
                    _db.SaveChanges();

                    // Insert item(s) into table
                    foreach (var item in userData)
                    {
                        if (!_db.userDataItems.Any(x => x.id == item.id))
                        {
                            var newItem = new Item()
                            {
                                UserId = user.user_id, // shortened code for brevity.
                                DataId = newData.id    // id from object created above
                            };
                            _db.userDataItems.Add(newItem);
                        }
                        _db.SaveChanges();
                        successCount++;
                    }
                }
            }
            catch (Exception ex)
            {
                errorCount++;
                Console.WriteLine(String.Format("Error saving changes for user_id: {0} at {1}.", user.user_id.ToString(), DateTime.Now));
                Console.WriteLine("Message: " + ex.Message);
                Console.WriteLine("InnerException: " + ex.InnerException);
            }
        }
    }

    Console.WriteLine(String.Format("End at {0}...", DateTime.Now));
    Console.WriteLine(String.Format("Successful imports: {0} | Errors: {1}", successCount, errorCount));
    Console.WriteLine(String.Format("Total running time: {0}", (DateTime.Now - exportStart).ToString(@"hh\:mm\:ss")));
}
Unfortunately, the major issue is the number of database round-trips.
You make a round-trip:
For every user, to retrieve the user data by user id from the old database
For every user, to save the user data in the new database
For every user, to save the user data items in the new database
So if you have 3 million users and every user has an average of 5 user data items, that means at least 3m + 3m + 15m = 21 million database round-trips, which is insane.
The only way to dramatically improve the performance is to reduce the number of database round-trips.
Batch - Retrieve user by id
You can quickly reduce the number of database round-trips by retrieving all the user data for a batch at once, and since you don't have to track these entities, use AsNoTracking() for even more performance gains.
var list = batch.Select(x => x.user_id).ToList();
var userDatas = _oldDb.userData
    .AsNoTracking()
    .Where(x => list.Contains(x.user_id))
    .ToList();

foreach (var userData in userDatas)
{
    ....
}
This change alone should already save you a few hours.
Batch - Save Changes
Every time you save a user data row or an item, you perform a database round-trip.
Disclaimer: I'm the owner of the project Entity Framework Extensions
This library allows you to perform:
BulkSaveChanges
BulkInsert
BulkUpdate
BulkDelete
BulkMerge
You can either call BulkSaveChanges at the end of the batch, or build lists of entities to insert and call BulkInsert directly for even more performance.
You will, however, have to set a navigation property to the newData instance instead of setting the ID directly.
foreach (IEnumerable<users> batch in usersQuery.Batch(BatchSize))
{
    // Retrieve all user data for the batch at once.
    var list = batch.Select(x => x.user_id).ToList();
    var userDatas = _oldDb.userData
        .AsNoTracking()
        .Where(x => list.Contains(x.user_id))
        .ToList();

    // Create the lists used for BulkInsert
    var newDatas = new List<newData>();
    var newDataItems = new List<Item>();

    foreach (var userData in userDatas)
    {
        // newDatas.Add(newData);
        // newDataItem.OwnerData = newData;
        // newDataItems.Add(newDataItem);
    }

    _db.BulkInsert(newDatas);
    _db.BulkInsert(newDataItems);
}
EDIT: Answer to the subquestion
One of the properties of a newDataItem is the id of newData (e.g.
newDataItem.newDataId), so newData would have to be saved first in
order to generate its id. How would I BulkInsert if there is a
dependency on another object?
You must use navigation properties instead. With a navigation property, you never have to specify the parent id; you set the parent object instance instead.
public class UserData
{
    public int UserDataID { get; set; }

    // ... properties ...

    public List<UserDataItem> Items { get; set; }
}

public class UserDataItem
{
    public int UserDataItemID { get; set; }

    // ... properties ...

    public UserData OwnerData { get; set; }
}

var userData = new UserData();
var userDataItem = new UserDataItem();

// Use the navigation property to set the parent.
userDataItem.OwnerData = userData;
Tutorial: Configure One-to-Many Relationship
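For reference, a minimal fluent configuration for that one-to-many relationship might look like the sketch below (EF6 syntax; the UserContext name matches the context used later in this answer, but treat the whole class as illustrative):

using System.Data.Entity;

// Sketch: fluent one-to-many mapping for the UserData / UserDataItem classes above (EF6).
public class UserContext : DbContext
{
    public DbSet<UserData> UserDatas { get; set; }
    public DbSet<UserDataItem> UserDataItems { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        modelBuilder.Entity<UserDataItem>()
            .HasRequired(i => i.OwnerData)   // each item belongs to one UserData
            .WithMany(d => d.Items);         // a UserData exposes its items collection
    }
}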
Also, I don't see a BulkSaveChanges in your example code. Would that
have to be called after all the BulkInserts?
BulkInsert inserts directly into the database. You don't have to call SaveChanges or BulkSaveChanges; once you invoke the method, it's done ;)
Here is an example using BulkSaveChanges:
foreach (IEnumerable<users> batch in usersQuery.Batch(BatchSize))
{
    // Retrieve all user data for the batch at once.
    var list = batch.Select(x => x.user_id).ToList();
    var userDatas = _oldDb.userData
        .AsNoTracking()
        .Where(x => list.Contains(x.user_id))
        .ToList();

    // Create the lists used for BulkSaveChanges
    var newDatas = new List<newData>();
    var newDataItems = new List<Item>();

    foreach (var userData in userDatas)
    {
        // newDatas.Add(newData);
        // newDataItem.OwnerData = newData;
        // newDataItems.Add(newDataItem);
    }

    var context = new UserContext();
    context.userDatas.AddRange(newDatas);
    context.userDataItems.AddRange(newDataItems);
    context.BulkSaveChanges();
}
BulkSaveChanges is slower than BulkInsert because it has to use some internal methods from Entity Framework, but it is still way faster than SaveChanges.
In the example, I create a new context for every batch to avoid memory issues and gain some performance. If you re-use the same context for all batches, you will end up with millions of tracked entities in the ChangeTracker, which is never a good idea.
Entity Framework is a very bad choice for importing large amounts of data. I know this from personal experience.
That being said, I found a few ways to optimize things when I tried to use it in the same way you are.
The Context will cache objects as you add them, and the more inserts you do, the slower future inserts will get. My solution was to limit each context to about 500 inserts before I disposed of that instance and created a new one. This boosted performance significantly.
I was able to make use of multiple threads to increase performance, but you will have to be very careful about resource contention. Each thread will definitely need its own Context, don't even think about trying to share it between threads. My machine had 8 cores, so threading will probably not help you as much; with a single core I doubt it will help you at all.
Turn off change tracking with AutoDetectChangesEnabled = false; automatic change detection is incredibly slow. Unfortunately this means you have to make changes through the context's entry API instead. No more Entity.Property = "Some Value"; it becomes something like context.Entry(entity).Property(e => e.Property).CurrentValue = "Some Value"; (or close to that, I don't remember the exact syntax), which makes the code ugly.
Any queries you do should definitely use AsNoTracking.
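As a rough sketch of those last two points (EF6 API; the context and entity names below are illustrative, not from the question):

using System.Data.Entity;   // for AsNoTracking()
using System.Linq;

// Sketch (EF6): disable automatic change detection and read without tracking.
// "LegacyContext", "Users" and "Status" are illustrative names.
using (var context = new LegacyContext())
{
    context.Configuration.AutoDetectChangesEnabled = false;
    context.Configuration.ValidateOnSaveEnabled = false;

    // Reads that only feed the migration don't need to be tracked.
    var activeUsers = context.Users
        .AsNoTracking()
        .Where(u => u.Status == "active")
        .ToList();

    // ... add the new entities, call SaveChanges() once per small batch,
    // and dispose/recreate the context every few hundred inserts ...
}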
With all that, I was able to cut a ~20 hour process down to about 6 hours, but I still don't recommend using EF for this. It was an extremely painful project, due almost entirely to my poor choice of EF for adding data. Please use something else... anything else...
I don't want to give the impression that EF is a bad data access library; it is great at what it was designed to do. Unfortunately, bulk data import is not what it was designed for.
I can think of a few options.
1) A small speed increase can be had by moving your _db.SaveChanges() below your foreach() closing bracket:
foreach (...)
{
}
successCount += _db.SaveChanges();
2) Add items to a list, and then add them to the context:
List<ObjClass> list = new List<ObjClass>();
foreach (...)
{
    list.Add(new ObjClass() { ... });
}
_db.newUserData.AddRange(list);
successCount += _db.SaveChanges();
3) If it's a big amount of data, save in batches:
List<ObjClass> list = new List<ObjClass>();
int cnt = 0;
foreach (...)
{
    list.Add(new ObjClass() { ... });
    if (++cnt % 100 == 0) // batches of 100
    {
        _db.newUserData.AddRange(list);
        successCount += _db.SaveChanges();
        list.Clear();

        // Optional if a HUGE amount of data
        if (cnt % 1000 == 0)
        {
            _db = new MyDbContext();
        }
    }
}

// Don't forget the final flush!
_db.newUserData.AddRange(list);
successCount += _db.SaveChanges();
list.Clear();
4) If it's REALLY big, consider using bulk inserts. There are a few examples on the internet and a few free libraries around.
Ref: https://blogs.msdn.microsoft.com/nikhilsi/2008/06/11/bulk-insert-into-sql-from-c-app/
With most of these options you lose some control over error handling, as it is difficult to know which record failed.
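For option 4, a rough sketch of SqlBulkCopy from ADO.NET (the connection string, table and column names below are placeholders, not taken from the question):

using System.Data;
using System.Data.SqlClient;

// Sketch: bulk-insert in-memory rows with SqlBulkCopy; names are placeholders.
var table = new DataTable();
table.Columns.Add("UserId", typeof(int));
table.Columns.Add("DataId", typeof(int));
// ... fill table.Rows from the list you built in memory ...

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var bulkCopy = new SqlBulkCopy(connection) { DestinationTableName = "dbo.UserDataItems" })
    {
        bulkCopy.ColumnMappings.Add("UserId", "UserId");
        bulkCopy.ColumnMappings.Add("DataId", "DataId");
        bulkCopy.WriteToServer(table);
    }
}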
I have code to create a Vendor Payment for a Vendor Bill, like this:
InitializeRecord ir = new InitializeRecord();
ir.type = InitializeType.vendorPayment;

InitializeRef iref = new InitializeRef();
iref.typeSpecified = true;
iref.type = InitializeRefType.vendorBill;
iref.internalId = vendorBillId;
ir.reference = iref;

Login();
ReadResponse getInitResp = _service.initialize(ir);
if (getInitResp.status.isSuccess)
{
    Record rec = getInitResp.record;
    ((VendorPayment)rec).total = (double)amount; // I don't want to pay it all, just half or some amount less than the total
    ((VendorPayment)rec).totalSpecified = true;

    WriteResponse writeRes = _service.add(rec);
    return writeRes.status;
}
That creates a payment, but the total is not applied; the payment pays the full amount of the vendor bill's total.
I don't know what I'm missing here.
When applying payments to a bill you cannot change the body-level amount field; you have to change the line-level amount field on the apply line item record. I am not sure of the exact syntax in SuiteTalk, but that should work.
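I haven't verified the exact SuiteTalk syntax either, but based on the VendorPayment schema the apply lines live on the payment's applyList, so the idea would be roughly the following (untested; the property names come from the generated SuiteTalk proxy classes and may differ slightly in yours):

// Untested sketch: instead of setting the body-level total, set the amount on
// the apply line that references the vendor bill.
var payment = (VendorPayment)getInitResp.record;
if (payment.applyList != null && payment.applyList.apply != null)
{
    foreach (VendorPaymentApply applyLine in payment.applyList.apply)
    {
        if (applyLine.doc.ToString() == vendorBillId)
        {
            applyLine.apply = true;
            applyLine.applySpecified = true;
            applyLine.amount = (double)amount;   // the partial amount to pay
            applyLine.amountSpecified = true;
        }
        else
        {
            applyLine.apply = false;
            applyLine.applySpecified = true;
        }
    }
}
WriteResponse writeRes = _service.add(payment);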
I have a program that bills a customer based on a custom entity. It is a subscription-based process. We enter an order; if that order has a product that has been defined as a subscription product, then the custom entity is created. The product can be either a single product or a bundle. When the subscription ends, I need to end the customer's ability to continue using the software.
I can read the product and I can determine if the product is a bundle by examining the attribute "productstructure". How do I determine which products are included in the bundle?
Thanks
If you retrieve the product, you can use "productstructure" to determine whether it is a bundle: if the value is 3, it is a bundle.
You can then query the "productassociation" table where the attribute "productid" equals the Id of the bundle.
The attribute you need to retrieve from the "productassociation" table is "associatedproduct". You then retrieve the product instance.
QueryExpression productBundleQuery = new QueryExpression();
productBundleQuery.Distinct = false;
productBundleQuery.EntityName = "productassociation";
productBundleQuery.ColumnSet = new ColumnSet("associatedproduct");
productBundleQuery.Criteria = new FilterExpression
{
    Conditions = { new ConditionExpression("productid", ConditionOperator.Equal, bundle.Id) }
};

EntityCollection productBundleCollection = _service.RetrieveMultiple(productBundleQuery);

foreach (Entity productAssociation in productBundleCollection.Entities)
{
    Entity product = _service.Retrieve("product", ((EntityReference)productAssociation["associatedproduct"]).Id, new ColumnSet("name", ...));
    // Do something....
}
Hi, I'm using the Magento SOAP API v2 with C#. So far I have been calling
var groupedProducts = magentoService.catalogProductLinkList(sessionId, "grouped", id, "productId");
That does return grouped products, but I would instead like to retrieve the simple products (such as a green, large t-shirt) that are associated with a configurable t-shirt.
How can this be achieved?
Unfortunately, that's not possible with the Magento SOAP API: you are not able to retrieve the child products of a parent product via the API. Believe me, I tackled this myself some time ago. I can suggest two fixes and one workaround.
Workaround - Try to retrieve the child products by sku or name. This can work provided that all your child products use the parent's name or sku as a prefix. That's how I resolved it in the beginning, and it worked well as long as the client did not introduce child product names that did not match the parent name. Here's some sample code:
// fetch configurable products
filters filter = new filters();
filter.filter = new associativeEntity[1];
filter.filter[0] = new associativeEntity();
filter.filter[0].key = "type_id";
filter.filter[0].value = "configurable";

// get all configurable products
var configurableProducts = service.catalogProductList(sessionID, filter, storeView);

foreach (var parent in configurableProducts)
{
    // look for simple products whose sku starts with the parent's sku
    filters childFilter = new filters();
    childFilter.filter = new associativeEntity[1];
    childFilter.filter[0] = new associativeEntity();
    childFilter.filter[0].key = "type_id";
    childFilter.filter[0].value = "simple";
    childFilter.complex_filter = new complexFilter[1];
    childFilter.complex_filter[0] = new complexFilter();
    childFilter.complex_filter[0].key = "sku";
    childFilter.complex_filter[0].value = new associativeEntity() { key = "LIKE", value = parent.sku + "%" };

    var simpleProducts = service.catalogProductList(sessionID, childFilter, storeView);

    // do whatever you need with the simple products
}
Fix #1 - Free - Write your own API extension. To retrieve the child products you could use:
$childProducts = Mage::getModel('catalog/product_type_configurable')->getUsedProducts(null, $product);
You then send the results back to the API caller, and all should be well. I have not tried that myself, though, so I'm not sure whether there are any other problems along the way.
Fix #2 - Paid - Get the (excellent, but pricey) CoreAPI extension from netzkollektiv. That's what I did when the workaround stopped working for me, and I never regretted the decision.
I don't think what you are trying to do is possible with the default Magento SOAP API.
What you could do is create a custom API, e.g.:
How to setup custom api for Magento with SOAP V2?
http://www.magentocommerce.com/api/soap/create_your_own_api.html
Then create the logic to retrieve all the simple products associated with that configurable product.
$_product = Mage::getModel('catalog/product')->load($id);

// check if it's a configurable product
if ($_product->isConfigurable()) {
    // load simple product ids
    $ids = $_product->getTypeInstance()->getUsedProductIds();
    // OR
    $ids = Mage::getResourceModel('catalog/product_type_configurable')->load($_product);
}
Please be aware that it should be
... new associativeEntity() { key="like", ...
and not
... new associativeEntity() { key="LIKE", ....
Upper case LIKE does not work.
I found a Magento extension on GitHub which includes this functionality: it returns the configurable product's sub-product information in the same response.
https://github.com/Yameveo/Yameveo_ProductInfo