1. What Is Unique about Our Application of a Modification of the Penrose Cyclic Cosmology Theory?
We review the cited modification of the Penrose cyclic conformal cosmology paradigm and include it in the cited partition function, as a way to incorporate the modified Heisenberg Uncertainty Principle (HUP) and thereby ascertain the role of the inflaton, which we write up using Padmanabhan's reference.
This modification of the HUP, included in our representation of the inflaton in the partition function, will then lead, once we include the cited results, to a way to discuss how to obtain a uniform value of Planck's constant, h, per cycle of creation of new universes.
By way of contrast, the supposition given by Susskind and others is of up to 10^100 universes, with only say 10^6 of them surviving due to sufficiently “robust” cosmological values for stable physical law. The end result in our treatment is that what we would have instead is a “multiverse” which is dynamic and stable over time. And so we review our present modification of the Penrose cyclic conformal cosmology model to take into account multiple universes.
2. Extending Penrose’s Suggestion of Cyclic Universes, Black Hole Evaporation, and the Embedding Structure Our Universe Is Contained within. This Multiverse Embeds BHs and May Resolve What Appears to Be an Impossible Dichotomy
We suppose that there are no fewer than N universes undergoing Penrose “infinite expansion” contained in a mega universe structure. Furthermore, each of the N universes has black hole evaporation, with Hawking radiation from decaying black holes. If each of the N universes is defined by a partition function, then there exists an information ensemble of mixed minimum information, correlated as about 10^7 - 10^8 bits of information per partition function in the set, so minimum information is conserved between a set of partition functions per universe. In following this we use the notation of the cited reference, while noting that there is a subsequent alteration of the notation used for partition functions.
However, there is non-uniqueness of the information put into each partition function. Furthermore, Hawking radiation from the black holes is collated via a strange-attractor collection in the mega universe structure to form a new big bang for each of the N universes. Verification of this mega-structure compression and expansion of information, with a non-uniqueness of information placed in each of the N universes, favors ergodic mixing treatments of initial values for each of the N universes expanding from a singularity beginning. The value used will be from (Ng, 2008). How to tie in this energy expression, as in Equation (1), will be to look at the formation of a nontrivial gravitational measure as a new big bang for each of the N universes, via the density of states at a given energy for a partition function (Poplawski, 2011).
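For orientation, the counting from Ng's infinite quantum statistics, as a sketch based on (Ng, 2008) rather than a reconstruction of the paper's exact Equation (1), treats the constituents as indistinguishable, so that the partition function and entropy take the form

$$
Z_N \sim \frac{1}{N!}\left(\frac{V}{\lambda^3}\right)^N, \qquad
S \approx N\left[\ln\!\left(\frac{V}{N\lambda^3}\right) + \frac{5}{2}\right],
$$

which for $V \sim N\lambda^3$ reduces to $S \approx N$: the entropy effectively counts the number of constituents, which is what makes the "bits per partition function" bookkeeping above possible.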
Each partition function identified with Equation (2) above is taken with the iteration for N universes. Then the following holds, namely, from the cited references.
For N universes, with each term for j = 1 to N being the partition function of each universe just before the blend into the RHS of Equation (3) above for our present universe: each of the independent universes is constructed by the absorption of one to ten million black holes taking in energy. Furthermore, the main point is similar to what was done in terms of general ergodic mixing.
What is done in Claim 1 and Claim 2 is to come up with a protocol as to how a multi-dimensional representation of black hole physics enables continual mixing of spacetime, largely as a way to avoid the Anthropic principle as to a preferred set of initial conditions. How can a graviton with a wavelength 10^-4 times the size of the universe interact with a Kerr black hole, spatially? Embedding the BH in a multiverse setting may be the only way out.
Claim 1 is particularly important. The idea here is to use what is known as CCC cosmology, which can be thought of as the following.
First, have a big bang (initial expansion) for the universe. After redshift z = 10, a billion years ago, SMBH formation starts. Matter-energy is vacuumed up by the SMBHs, which at a much later date than today (the present era) gather up all the matter-energy of the universe and recycle it in a cyclic conformal translation, as follows:
Here c1 is a constant. Then we have, for consistency in our presentation, that the main methodology of the Penrose proposal is as shown in Equation (6), where we evaluate a change in the metric by a conformal mapping.
Penrose’s suggestion has been to utilize the following 
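For the reader's orientation, the standard CCC conformal rescaling, given here as a sketch of Penrose's well-known prescription (the paper's own equation numbers should be consulted for the exact variant used), is

$$
\hat{g}_{ab} = \Omega^2\, g_{ab}, \qquad \Omega \;\to\; \omega = -\frac{1}{\Omega}
$$

at the crossover from one aeon to the next, so that the conformal factor is inverted (with a sign flip) as the infinitely expanded late universe is identified with the next cycle's big bang.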
The infall into cosmic black holes has been the main mechanism which the author asserts would be useful for the recycling apparent in Equation (8) above, with the caveat that Planck's constant is kept constant from cycle to cycle, as represented by
Equation (9) is to be generalized by a weighted averaging as given by Equation (3), where the averaging is collated over perhaps thousands of universes, call that number N, with an ergodic mixing of all these universes, the ergodic mixing represented by Equation (3), so as to generalize Equation (9) from cycle to cycle.
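As a purely illustrative numerical sketch (the values below are hypothetical, not the paper's data), the N-universe weighted averaging of per-universe partition-function values described around Equation (3) amounts to an ensemble mean of this kind:

```python
import random

def ensemble_average(values, weights=None):
    """Weighted ensemble average over N 'universes'.

    With no weights supplied, this is the plain mean used in
    an Equation (3)-style collation over N partition functions.
    """
    n = len(values)
    if weights is None:
        weights = [1.0 / n] * n
    total_w = sum(weights)
    return sum(w * v for w, v in zip(weights, values)) / total_w

random.seed(0)
N = 1000  # number of recycled "universes" (illustrative)
# hypothetical per-universe partition-function values, scattered about a mean
xi = [1.0 + random.gauss(0.0, 0.05) for _ in range(N)]
xi_bar = ensemble_average(xi)
```

The point of the sketch is only that the collated average is far more stable than any single universe's value, which is the role Equation (3) plays in generalizing Equation (9) from cycle to cycle.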
3. Now for the Mixing Being Put in, and Birkhoff’s Ergodic Mixing Theorem
We will, afterwards, give the particulars of the partition function. But before that, we will address the “mixing” of inputs into the partition function of the universe, i.e., an elaboration on Equation (3) above. To do this, first look at the following.
Birkhoff’s Ergodic mixing theorem:
Let $T$ be a measure-preserving transformation of a probability space $(X, \Sigma, \mu)$, and let $f \in L^1(\mu)$.
Then for almost every $x \in X$ the following time average exists:
$$
\hat{f}(x) = \lim_{n \to \infty} \frac{1}{n} \sum_{k=0}^{n-1} f\!\left(T^k x\right).
$$
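A minimal numerical illustration of Birkhoff's theorem (not part of the original paper) uses an irrational rotation of the circle, which is ergodic with respect to Lebesgue measure: the time average of an observable along a single orbit converges to its space average over the whole circle.

```python
import math

def time_average(f, T, x0, n):
    """Birkhoff time average (1/n) * sum_{k=0}^{n-1} f(T^k(x0))."""
    total, x = 0.0, x0
    for _ in range(n):
        total += f(x)
        x = T(x)
    return total / n

alpha = math.sqrt(2)                       # irrational rotation number => ergodic map
T = lambda x: (x + alpha) % 1.0            # measure-preserving rotation of [0, 1)
f = lambda x: math.sin(2 * math.pi * x)    # space average over [0, 1) is 0

avg = time_average(f, T, x0=0.1, n=200_000)
```

Here `avg` converges to the space average 0, independent of the starting point `x0`; this independence of initial conditions is exactly the property the ergodic-mixing argument above relies on.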
In the end, we need a way to establish the bona fides of Equation (9); the averaging in both Equation (10) and Equation (4) above must be put into a consistent general treatment, so as to obtain an invariant, from cycle to cycle, of cosmological creation.
To do this, we also refer to the generalized treatment given in the cited reference.
Having said that, the remaining constraint is to come up with a suitably averaged value of the partition function in the above work. Our averaging will eventually have to be reconciled with the Birkhoff Ergodic Mixing theorem.
4. How to Average out Planck's Constant, Using the Partition Function Given in Equation (11)
We begin with what is given in Shankar's treatment of the partition function, as given by
Using, for Pre-Planckian space-time, the approximation of
Approximate this using Beckwith's treatment of the HUP in Pre-Planckian space-time:
Put in now the value of the inflaton given by Padmanabhan, as
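A commonly quoted form of Padmanabhan's inflaton solution for a scale factor $a(t) \propto t^{\gamma}$, given here as a sketch for reference (the paper's Equation (15) should be checked against Padmanabhan's original text), is

$$
\phi = \sqrt{\frac{\gamma}{4\pi G}}\,
\ln\!\left[\sqrt{\frac{8\pi G V_0}{\gamma(3\gamma - 1)}}\; t\right],
\qquad
V(\phi) = V_0 \exp\!\left[-\sqrt{\frac{16\pi G}{\gamma}}\,\phi\right],
$$

where $V_0$ and $\gamma$ are constants of the power-law inflationary solution.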
Put the value of the inflaton as given in Equation (15) into the partition function of Equation (14):
Then, using Shankar, we take the result of Equation (16) and use it in Equation (15) and also Equation (14). The end result is a massive cancellation of terms, so as to obtain Equation (17) below:
Then by use of Equation (11) we obtain
This is the baseline of the constraint which will make Planck's constant a constant per universe-creation cycle, as given by Equation (9); i.e., Equation (9) is confirmed by Equation (18). We will next examine how this ties into Equation (10) above, via how the averaging affects the choice of inputs into Equation (18). Doing this will allow investigation of how to falsify the Birkhoff Ergodic mixing theorem, as mentioned next.
5. Applying the Birkhoff Ergodic Averaging Equation (10) to the Inputs into Equation (18)
To do this, we specifically look at the wavelength, applying the cited averaging to a wavelength. One over the wavelength is proportional to frequency, so we represent the wavelength by the following situation, with invariance set in stone. Here we assume the formulation is as follows, with N the number of recycled “universes”:
i.e., the averaging by the Birkhoff theorem implies that there is a critical invariance. This invariance should then be linked to the diameter of a nonsingular bounce point. A nonsingular bounce, i.e., the beginning of an expansion of a new universe, is the main point of the cited reference. Furthermore, if we apply the insights of the cited work we obtain
We will try to show, at a later date, that these are invariant per cycle; but the upshot is that if there is a natural fit to Equation (19), and if the wavelength is fixed as an invariant per cycle, as given by Equation (19), then the invariance of Planck's constant per cycle is maintained.
6. Conclusion: Implications of the Invariance of h: Uniform Physical Laws per Universe, and Not the 10^1000 Created Universes with Only Say 10^10 Surviving through a Cosmic Cycle
In a word this demolishes the program of the cosmic landscape of string theory  , and gives credence to the possibility of an invariant multiverse, which would not be collapsing.
If this is confirmed experimentally, it will do much to reduce what has at times been a post-modern fragmentation of basic physics inquiry, and to give physics a uniform set of laws, regardless of whether there were many worlds or just one, i.e., one universe or many universes. It would also allow investigation of the cited information-theory approach to event horizons and early-universe cosmology. How that approach could influence a choice of partition functions is given in this paper's Appendix.
This work is supported in part by National Nature Science Foundation of China grant No. 11375279.
Appendix: Highlights of J.-W. Lee’s Paper 
The following formulation highlights how entropy generation blends in with quantum mechanics, and how the breakdown of some of the assumptions used in Lee's paper coincides with the growth of degrees of freedom. What is crucial to Lee's formulation is Rindler geometry, not the curved-space formulation of initial universe conditions. First of all, quoting (Lee, 2010):
“Considering all these recent developments, it is plausible that quantum mechanics and gravity has information as a common ingredient, and information is the key to explain the strange connection between two. If gravity and Newton mechanics can be derived by considering information at Rindler horizons, it is natural to think quantum mechanics might have a similar origin. In this paper, along this line, it is suggested that quantum field theory (QFT) and quantum mechanics can be obtained from information theory applied to causal (Rindler) horizons, and that quantum randomness arises from information blocking by the horizons.”
To start, we look at the Rindler partition function, as given by (Lee, 2010)
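For context, a standard fact about Rindler horizons (not Lee's specific Equation (A1), whose exact form should be taken from his paper): an observer with uniform proper acceleration $a$ sees the Minkowski vacuum as thermal at the Unruh temperature, so a Rindler-horizon partition function takes the Boltzmann form

$$
T_U = \frac{\hbar a}{2\pi c k_B}, \qquad
Z_R = \sum_n e^{-E_n / k_B T_U}
    = \sum_n \exp\!\left(-\frac{2\pi c\, E_n}{\hbar a}\right),
$$

which is the sense in which information blocked by the causal horizon produces a thermal statistical description.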
As stated by Lee, we expect this to be equal to the quantum mechanical partition function of a particle with mass m in Minkowski space-time. Furthermore, Lee made an equivalence between Equation (A1) and (Lee, 2010)
where the action “integral” for each path leads to a wave function for each path.
If we do a rescaling, then the above wave equation can lead to a Schrödinger equation.
The example given by (Lee, 2010) is that there is a Hamiltonian for which
Here, V is a potential and can have arbitrary values before measurement; to a degree, Z represents uncertainty in measurement. In Rindler co-ordinates, with proper-time variance, then
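The Rindler co-ordinates referred to here, in one standard form with $c = 1$ (a sketch; the notation may differ from Lee's), relate to Minkowski co-ordinates $(t, x)$ by

$$
t = \rho \sinh(a\tau), \quad x = \rho \cosh(a\tau)
\;\;\Rightarrow\;\;
ds^2 = -a^2 \rho^2\, d\tau^2 + d\rho^2 + dy^2 + dz^2,
$$

where a worldline of constant $\rho$ has proper acceleration $1/\rho$, and $\tau$ plays the role of the proper-time variable mentioned above.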
Here, the first plane is orthogonal to the second. If so, then
Now, for the above situation, the following are equivalent.
1) The thermal partition function arises from information loss about the field beyond the Rindler horizon.
2) The QFT formulation is equivalent to the purely information-based statistical treatment suggested in that paper.
3) QM emerges from information theory arising from the Rindler co-ordinates.
Lee also forms a Euclidean version of the partition function, using the Euclidean action for the scalar field in the initial frame, i.e.
There exists an analytic continuation leading to the usual zero-temperature QM partition function for fields.
Important Claim: The following are equivalent.
1) The two partition functions are obtained from one another by analytic continuation.
2) The two partition functions are equivalent.