[Market strategy] Limited amounts of GU-pairs

  • Idea
  • Updated 7 years ago
As GU-pairs in designs continue to be a problem area for us, I would like a strategy that tries to limit the amount used, to avoid a whole design getting messed up because of too many GU-pairs.

I would like a strategy that punishes exponentially.

For a percentage of GU-pairs between 0 and 5% of the total basepairs in a design – give 0

For a percentage of GU-pairs between 6 and 10% – give -1

For a percentage of GU-pairs between 11 and 15% – give -2

For a percentage of GU-pairs between 16 and 20% – give -4

For a percentage of GU-pairs between 21 and 25% – give -16
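The tiers above can be sketched as a simple scoring function. This is a minimal sketch, not the actual strategy-market implementation: the function name is made up here, and what happens above 25% is an assumption (capped at -16, since the post stops at the last bin).

```python
def gu_penalty(num_gu_pairs, num_basepairs):
    """Tiered penalty for GU-pair content, following the proposed bins.

    Bin boundaries and scores (0, -1, -2, -4, -16) come from the post;
    capping everything above 25% at -16 is an assumption.
    """
    if num_basepairs == 0:
        return 0  # no basepairs, nothing to penalize
    pct = 100.0 * num_gu_pairs / num_basepairs
    if pct <= 5:
        return 0
    elif pct <= 10:
        return -1
    elif pct <= 15:
        return -2
    elif pct <= 20:
        return -4
    else:
        return -16
```

For example, a design with 20 basepairs of which 3 are GU (15%) would score -2, while 5 GU-pairs (25%) would already score -16.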

Eli Fisker

  • 2223 Posts
  • 484 Reply Likes

Posted 7 years ago


boganis

  • 78 Posts
  • 4 Reply Likes
Why? A folding strategy based on excluding something that occurs naturally does not seem like a good solution. By the way, I found this paper heading interesting:
Consecutive Terminal GU Pairs Stabilize RNA Helices
http://pubs.acs.org/doi/abs/10.1021/b...

Eli Fisker

Hi Boganis!

Thanks for the article, I find the abstract very interesting and will try to get hold of the full article. I really do believe this, as I have been wondering why some designs got away with a GU in a terminal position (see my post "Rulebreaking winners"; Quasispecies said he had been seeing this in natural RNA too). The Vienna energy model is set by default to penalize GUs (and AUs) at the end of a helix.

The reason for my strategy is that I think designs with very high GU content are always a really bad idea, and get worse the higher the percentage of GU-pairs, even though GU-pairs are used in RNA design by nature.

But you might have a point that with this strategy I start penalizing too early (at 6% GU-pairs and higher). If that is the case, I ask the devs/programmers to raise our robot's tolerance for GU-pairs a bit.

Thanks for your concern.

Eli Fisker

Actually, the thought of setting rules too strict for our little bot makes me want to start penalizing only from 11% GU-pairs (-1), exponential like before.
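One reading of this adjustment is that every bin simply shifts one step, so nothing is deducted below 11%. The scores for the shifted upper bins are an assumption based on "exponential like before", and the function name is again made up for the sketch:

```python
def gu_penalty_tuned(num_gu_pairs, num_basepairs):
    """Adjusted tiers: no penalty until 11% GU-pairs, then the same
    exponential-style steps as the original proposal (assumed shift)."""
    if num_basepairs == 0:
        return 0
    pct = 100.0 * num_gu_pairs / num_basepairs
    if pct <= 10:
        return 0   # was -1 for 6-10% in the original proposal
    elif pct <= 15:
        return -1
    elif pct <= 20:
        return -2
    elif pct <= 25:
        return -4
    else:
        return -16
```

Under this reading, a design with 2 GU-pairs out of 20 basepairs (10%) now goes unpenalized instead of scoring -1.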

boganis

Your last tuning proposal is a great improvement on the original suggestion :)

Eli Fisker

Thx, I was glad you mentioned your concern. :)

Jeehyung Lee, Alum

  • 708 Posts
  • 94 Reply Likes
Dear Eli,

Your strategy has been added to our implementation queue with task id 114. You can check the schedule of the implementation here.

Thanks for sharing your idea!

EteRNA team

stevetclark

  • 15 Posts
  • 1 Reply Like
Eli, your strategy is similar to my strategy about too many GU-pairs. My strategy is linear, so I think your strategy will be more accurate.

Eli Fisker

Hi Steve!

Thanks for letting me know, I hadn't realized that.

It will be very interesting to see how they both do, especially depending on whether a design has few GU-pairs or many. Your strategy might end up being better for designs with few GU-pairs, and mine for designs with many. If that is the case, a combo would be good.
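One hypothetical way to "combo" the two approaches is to average a linear per-percent penalty (in the spirit of stevetclark's strategy) with the tiered one. Everything here is an illustrative assumption, not anything from the thread: the linear weight of 0.2 per percentage point, the averaging rule, and the function name.

```python
def combined_gu_score(num_gu_pairs, num_basepairs, weight=0.2):
    """Average of a linear GU penalty and the original tiered penalty.

    The linear weight (0.2 per percentage point) and the averaging are
    illustrative assumptions for this sketch.
    """
    if num_basepairs == 0:
        return 0.0
    pct = 100.0 * num_gu_pairs / num_basepairs
    linear = -weight * pct  # mild, steady deduction (linear strategy)
    # original tiered bins from the opening post
    if pct <= 5:
        tiered = 0
    elif pct <= 10:
        tiered = -1
    elif pct <= 15:
        tiered = -2
    elif pct <= 20:
        tiered = -4
    else:
        tiered = -16
    return (linear + tiered) / 2.0
```

At low GU content the linear term dominates gently, while at high GU content the tiered term's exponential-style jump takes over, which roughly matches the "few vs. many" split discussed above.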