Andreas Wagner
We propose a classical analogue of the vertex algebra in the context of classical integrable field theories. We use this fundamental notion to describe the auxiliary function of the linear auxiliary problem as a classical vertex operator. Then, using the underlying algebra satisfied by the auxiliary function together with the linear auxiliary problem, we identify the local integrals of motion, which by construction are in involution. The time components of the Lax pair are also identified in terms of the classical vertex operators. Systems in the presence of point-like defects as well as systems on the semi-infinite line are investigated. Specific examples associated with the classical Yangian and twisted Yangian are also presented.
Andreas Wagner
This paper reviews aspects of VLSI fabrication technology and the associated design and design-automation methods, to set the stage for the presentation of those papers in physical design and validation that were selected as the most influential DATE publications of the last 10 years.
Vahid Satarifard
A topological space is a set equipped with a structure, called a topology, which allows one to define the continuous deformation of subspaces and, more generally, all kinds of continuity. Euclidean spaces and, more generally, metric spaces are examples of topological spaces, as any distance or metric defines a topology. The deformations considered in topology are homeomorphisms and homotopies.
Jayson Freddie Cooper
Linear algebra is a field of mathematics that is universally agreed to be a prerequisite for a deeper understanding of machine learning.
Although linear algebra is a large field with many esoteric theories and results, the nuts-and-bolts tools and notations taken from the field are practical for machine learning practitioners. With a solid grounding in what linear algebra is, it is possible to focus on just the good or relevant parts.
In this tutorial, you will discover what exactly linear algebra is from a machine learning perspective.
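The nuts-and-bolts operations the tutorial refers to can be sketched in a few lines of plain Python; the weights and inputs below are illustrative values, not from any real model.

```python
# Minimal linear-algebra operations as they appear in machine learning:
# a dot product (the core of a linear model's prediction) and a
# matrix-vector product (a batch of such predictions at once).

def dot(u, v):
    """Inner product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def matvec(M, v):
    """Multiply matrix M (a list of rows) by vector v."""
    return [dot(row, v) for row in M]

# A linear model y = w . x + b is one dot product per example.
weights = [0.5, -1.0, 2.0]
bias = 0.1
x = [1.0, 2.0, 3.0]
prediction = dot(weights, x) + bias  # 0.5 - 2.0 + 6.0 + 0.1 = 4.6

# A batch of examples stacked as rows is a matrix-vector product;
# these unit-vector rows simply pick out individual weights.
X = [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0]]
print(matvec(X, weights))  # [0.5, -1.0]
print(prediction)
```

The same two primitives, generalized to matrix-matrix products, underlie most of the linear-algebra machinery used in practice.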
Saoussan Kallel-Jallouli
Time is not a simple subject. A complete description of time remained an unsolved problem for many decades. In this work, we propose the hypothetical existence of invisible unusual stuff, named “Zaman”, responsible for the variations of time. We explain how this new geometrical model of Zaman creates a kind of strong lensing effect: We can get two images, three images, or more, of the same object. We then conclude how Zaman offers a good solution to the Dark Matter problem. Moreover, this unusual stuff can explain the superposition principle of quantum mechanics. This new vision of time helps us understand better our physical universe, on tiny scales and on large scales.
Mohamed Daris
In carrying out this work we face the great uncertainty of finding quantum solutions of digital functions. This work concerns the design of quantum solutions of numerical functions, so that the reading of digital numbers is more precise, detaching the number from its own mathematical cellar and releasing it into a quantum cube, in which we find much more complicated equations that, thanks to digitized graphics, make our work more precise. We therefore built the quantum marker on these equations and on other demonstrations, so that it is ready for use in digital functions.
Jehonathan Bentovish*
Twenty-first Century Theoretical Physics is undergoing a major "Paradigmatic-Shift" from the Old "Material-Causal" Paradigm to the New "A-Causal Computation" Paradigm of the "Computational Unified Field Theory" (CUFT), also recently termed the "G-d's Physics" Paradigm! Contrary to the "Material-Causal" Paradigm, which assumes that every physical subatomic or relativistic phenomenon can be explained merely through direct physical interactions between subatomic "probe" and "target" (probability wave function) elements or between massive objects and their curvature of Space-Time (as described by Einstein's Relativistic Equations), the New "G-d's Physics" Paradigm negates the possibility of any such "Material-Causal" physical relationships due to the simultaneity of a singular higher-ordered "Universal Computational/Consciousness Principle's" (UCP) computation of all exhaustive spatial pixels in the universe for each consecutive "Universal Frame" comprising the entire physical cosmos at every minimal "time-point", e.g., computed at the rate of "c²/h" = 1.36 × 10⁻⁵⁰ sec! This New "A-Causal Computation" Paradigm for 21st Century Theoretical Physics opens a plethora of fascinating new possibilities, such as the unification of Relativity Theory and Quantum Mechanics; a complete unification of the four basic physical features of "space" and "energy", "mass" and "time" as secondary integral computational by-products of the UCP's singular operation; the possibility of transcending the strict relativistic "Speed of Light Barrier" for the transmission of any signal or physical effect across space and time; "time-reversal"; and even the "abolishment of death"!
The "G-d's Physics" New Paradigm has also been shown to challenge and revise the basic "Material-Causal" assumption underlying the "Big-Bang" Model, Einstein's Equations and the Second Law of Thermodynamics, to discard the purely hypothetical concepts of "dark-matter" and "dark-energy" as "superfluous" (non-existent), and, more generally, also Darwin's Evolutionary Natural Selection Principle! This article unifies the Four basic Forces known in Nature as a manifestation of the New "G-d's Physics" Paradigm's evolution of this singular Universal Consciousness Principle's full manifestation – from the limited "Material-Causal" assumption towards its fully expansive and integrated "Perfected" – Morally, Spiritually and even Physically – "Geula" (Redemption) State of Humanity and the Universe!
Jehonathan Bentovish*
Twenty-first Century Theoretical Physics is undergoing a profound "Paradigmatic-Shift" from the Old "Material-Causal" Paradigm underlying both Relativity Theory (RT) and Quantum Mechanics (QM) to the New "G-d's Physics" ("Computational Unified Field Theory", CUFT) Paradigm. The New 'G-d's Physics' Paradigm has been shown capable of resolving the apparent "theoretical inconsistency" between RT and QM and of offering an alternative theoretical explanation for the accelerated expansion of the physical universe (which cannot be explained by the purely hypothetical and undetectable "dark-matter" and "dark-energy" concepts). Based on the discovery of a singular higher-ordered "Universal Computational/Consciousness Principle" (UCP) which simultaneously computes every exhaustive spatial pixel in the universe at the incredible rate of "c²/h" = 1.36 × 10⁻⁵⁰ sec(!), this New 21st Century Physics Paradigm challenged and revised the basic "Material-Causal" assumption of the Old 20th Century RT & QM Models: challenging the "Big-Bang" Model, Einstein's Equations, QM's assumed "collapse" of the target's probability wave function (following its assumed direct physical interaction with the corresponding subatomic probe element), etc.
Instead, the New 'G-d's Physics' additional "Computational Invariance Principle" and "Universal Consciousness Reality" theoretical postulates indicate that only the "computationally invariant" UCP may be considered "constant" and "real", whereas the "computationally variant" four basic physical features of "space", "energy", "mass" and "time" may only be regarded as "phenomenal" and "transient" – i.e., existing only "during" each consecutive "Universal Frame" (UF) solely computed by the singular UCP, but "dissolving" back into this singular UCP "in-between" any two consecutive UF frames. This forces us to regard the whole existence of the entire physical universe (and its associated apparent "physical laws") as existing only "transiently" and "phenomenally" – as "manifestations" of this singular higher-ordered "Universal Computational/Consciousness Reality" (UCR)!
Alberto Miró Morán*
The cosmological constant problem, or vacuum catastrophe, lies at the convergence of general relativity and quantum field theory, and it is considered a fundamental problem in modern physics. In this paper we describe a different point of view on this problem. We discuss how the problem could depend on different definitions of the vacuum energy density.
Jehonathan Bentovish*
Twenty-first Century Physics is undergoing a Major "Paradigmatic-Shift" from the Old "Material-Causal" Paradigm to the New "A-Causal Computation" (ACC) Paradigm associated with the "Computational Unified Field Theory" (CUFT), also called "G-d's Physics". Based on the CUFT's discovery of a new singular higher-ordered "Universal Computational/Consciousness Principle" (UCP) which simultaneously computes every exhaustive spatial pixel in the universe for each consecutive "Universal Frame" (UF) at the incredibly rapid rate of "c²/h" = 1.36 × 10⁻⁵⁰ sec(!), this New 'A-Causal Computation' Paradigm of 21st Century Theoretical Physics dramatically revises some of the basic assumptions underlying the Old "Material-Causal" Paradigm: it challenged and revised the "Big-Bang" Model and Einstein's Equations, and discarded "dark-matter" and "dark-energy" as "superfluous" (i.e., non-existent!) due to the UCP's simultaneous computation of all exhaustive spatial pixels for each consecutive UF frame! Based on another "Computational Invariance Principle" postulate of this New 21st Century 'G-d's Physics' Paradigm, asserting that only the UCP may be regarded as "computationally-invariant", whereas the four basic physical features of "space", "energy", "mass" and "time" can only be considered "transient", "phenomenal" ("computationally-variant") manifestations of this singular higher-ordered UCP, it is derived that the entire physical universe is governed solely by the characteristics of this singular higher-ordered "Universal Consciousness Reality" (UCP), one of which is "Goodwill", e.g., the propensity to sustain all Life, to bestow "Goodness" and to evolve the entire universe towards an ultimate Perfected State (Physically, Morally and Spiritually), called "Geula" in Jewish Tradition.
Bentovish Jehonathan*
Twenty-first Century Theoretical Physics is undergoing a major "Paradigmatic-Shift" from the Old "Material-Causal" Paradigm of Relativity Theory (RT) & Quantum Mechanics (QM) to the New "G-d's Physics" Paradigm. This New 'G-d's Physics' (Computational Unified Field Theory, CUFT) Paradigm has been published in over thirty peer-reviewed articles and books and has been shown capable of resolving the apparent "theoretical inconsistency" between RT & QM. It also discovered the existence of a singular higher-ordered "Universal Computational/Consciousness Principle" (UCP) that simultaneously computes every exhaustive spatial pixel in the universe at the incredible rate of "c²/h" = 1.36 × 10⁻⁵⁰ sec(!), giving rise to an incredibly rapid series of "Universal Frames" (UF's) comprising the entire physical universe (which "dissolves" back into the singularity of the UCP 'in-between' any two consecutive UF frames!). This New 'G-d's Physics' Paradigm is also shown to challenge and negate key RT & QM 'Material-Causal' assumptions such as the 'Big-Bang' Model, Einstein's Equations and the "collapse" of the probability wave function. This article evinces that the New 'G-d's Physics' Paradigm similarly negates Darwin's Natural Selection Principle's (DNSP) dual 'Material-Causal' assumptions due to the simultaneity of the UCP's computation of all exhaustive spatial pixels for each consecutive UF frame!
Instead, the New 'G-d's Physics' Paradigm points at the UCP's singular, higher-ordered characteristics of the "Ten Laws of Hierarchical Manifestation", including "Free-Will", "Good-Will", the "Dynamic-Equilibrium" Moral Principle and its selection of one of multiple "possible futures" based on the Moral Choices of each Individual Human Consciousness (at any point in time); and the UCP's "Reversed Time Goal Hypothesis", which completely integrates all Physical, Biological and Moral developments, pointing at the UCP's Ultimate "Geula" (e.g., Physical, Moral & Spiritual "Perfected State") Goal of the universe, towards which this singular "Universal Consciousness Reality" continuously advances the entire physical universe and Humanity!
Jehonathan Bentovish
Twenty-first Century Theoretical Physics is undergoing a "Paradigmatic-Shift" (akin to Einstein's Relativity Theory (RT) "Paradigmatic-Shift" over 100 years ago) from the Old "Material-Causal" Paradigm underlying both RT and Quantum Mechanics (QM) to the New "Computational Unified Field Theory" (CUFT), recently called the "G-d's Physics" New Paradigm! This New "G-d's Physics" Paradigmatic-Shift was called for due to the apparent "theoretical inconsistency" between the two "pillars" of the Old "Material-Causal" Paradigm, i.e., RT & QM, as well as the inability of the Old Paradigm to account for up to 95% of all the mass and energy in the universe except through concepts that could not be verified empirically. The New "G-d's Physics" Paradigm is based on the discovery of a singular, higher-ordered "Universal Computational Principle" (UCP) which simultaneously computes all exhaustive spatial pixels in the universe for each consecutive "Universal Frame" (UF) at the unfathomable rate of "c²/h" = 1.36 × 10⁻⁵⁰ sec, with the entire universe "dissolving" into the UCP's singularity "in-between" any two consecutive UF frames! This New "G-d's Physics" A-Causal Computation Paradigm was shown to challenge, negate and revise some of the most fundamental assumptions of the Old "Material-Causal" Paradigm, such as the "Big-Bang" Model and Einstein's Equations, discarding "dark-matter" and "dark-energy" as "superfluous" (non-existent!) along with QM's "material-causal" assumption regarding the "collapse" of the probability wave function due to its assumed direct physical interaction with another subatomic "probe" element. A few key new "Theoretical Postulates" of the New "G-d's Physics" Paradigm include the "Universal Computational Formula", shown capable of unifying the Four basic Forces as well as the four basic physical features of "space", "energy", "mass" and "time", and of integrating all key relativistic and quantum laws and relationships.
Nevertheless, "G-d's Physics" New "A-Causal Computation" Paradigm points at the complete dependency of the UCP's simultaneous computation of all exhaustive spatial pixels in the universe upon the UCP's "Ten Laws of Hierarchical Manifestation", which include its Three "Computational Dimensions", "Supra Spatial-Temporal Reservoir", "Dynamic-Equilibrium Moral Principle", "Expansion of Consciousness (Hypothesis)", "Collective Human Consciousness Focus (Hypothesis)", "Reversed-Time Goal Hypothesis", and the "Good-Will" & "Free-Will" characterizations of the UCP. Therefore, the New "G-d's Physics" Paradigm of 21st Century Physics portrays an entirely "New Universe" – one which completely depends on the UCP's continuous computation, "dissolution", sustenance and evolution of all exhaustive spatial pixels comprising the universe towards the "Ultimate Goal" of a "Perfected: Physically, Morally and Spiritually" State of the World (and Humanity) which realizes the Singularity, "Unity" and "Goodness" of this "Universal Consciousness Reality", called "Geula" in Jewish Tradition!
Jehonathan Bentovish
Twenty-first Century Theoretical Physics is in a state of "Paradigmatic-Crisis" wherein the two pillars of the Old "Material-Causal" Paradigm (Relativity Theory & Quantum Mechanics) seem "inconsistent", and up to 95% of all the energy and mass in the universe cannot be directly accounted for, e.g., being deemed "Dark-Energy" and "Dark-Matter"?! Indeed, the quest for a satisfactory answer to these two primary conundrums: resolving the basic RT-QM inconsistency and explaining what precisely comprises these (purely hypothetical) "Dark-Matter" and "Dark-Energy" concepts, is most likely the most important task to be resolved by any (satisfactory) 21st Century Physics "New Paradigm"! Fortunately, such a New "Computational Unified Field Theory" (CUFT), lately termed the "G-d's Physics" Paradigm, provides satisfactory answers to these two critical fundamental issues. "G-d's Physics" was shown to resolve the apparent RT-QM "theoretical inconsistency" based on the discovery of a new singular higher-ordered "Universal Computational/Consciousness Principle" (UCP) which simultaneously computes every exhaustive spatial pixel in the universe at the incredible rate of "c²/h" = 1.36 × 10⁻⁵⁰ sec, creating an extremely rapid series of "Universal Frames" (UF's) comprising the entire physical universe at every "minimal time-point"! Specifically, this New "G-d's Physics" Paradigm explains the fundamental nature of "Dark-Matter" and "Dark-Energy" in a completely new manner – i.e., as representing the operation of the singular higher-ordered UCP's "A-Causal Computation"! Hence, instead of "Dark-Matter" and "Dark-Energy" comprising actual "material" (hypothetical) elements, the New 21st Century "G-d's Physics" Paradigm explains these constructs as merely a manifestation of the singular higher-ordered UCP's accelerated addition of exhaustive spatial pixels for each consecutive UF frame.
Viewed in this new light, the real "cause" of the universe's accelerated expansion is not found in any "material" or "physical" entities, but is rather contingent upon the successive simultaneous computation, "dissolution", re-computation and evolution of each exhaustive spatial pixel in the universe at the incredible rate of "c²/h" = 1.36 × 10⁻⁵⁰ sec!
Himanshu Chawla*
Alternatives to general relativity are physical theories that attempt to describe the phenomenon of gravitation in competition with Einstein's theory of general relativity. There have been numerous attempts to construct an ideal theory of gravity. These attempts can be split into four broad categories based on their scope. In this article, we discuss straightforward alternatives to general relativity, which do not involve quantum mechanics or force unification. Other theories which do attempt to construct a theory using the principles of quantum mechanics are known as theories of quantized gravity. Finally, the most ambitious theories attempt both to express gravity in quantum mechanical terms and to unify forces; these are called theories of everything. None of these alternatives to general relativity have gained wide acceptance. General relativity has withstood many tests, remaining consistent with all observations so far.
Himanshu Chawla*
Albert Einstein presented the theories of special relativity and general relativity in publications that either contained no formal references to previous literature, or referred only to a small number of his predecessors for fundamental results on which he based his theories, most notably to the work of Henri Poincaré and Hendrik Lorentz for special relativity, and to the work of David Hilbert, Carl F. Gauss, Bernhard Riemann, and Ernst Mach for general relativity. Subsequently, claims have been put forward about both theories, asserting that they were formulated, either wholly or in part, by others before Einstein. At issue is the extent to which Einstein and various others should be credited for the formulation of these theories, in light of priority considerations.
Jayson Freddie Cooper*
The implementation of new educational standards focuses on the project-based organization of training in school education. Accordingly, the view of educational activity as merely the process of receiving ready-made knowledge should be abandoned. The significance of the studied problem is thus justified by the need to develop systematic work on the introduction of inter-subject projects into mathematics teachers' pedagogical activity, since mathematics has wide application in other sciences yet is often neglected in lessons owing to time limits and the insufficient mathematical apparatus of school students. All this determines the objective of the paper: to define the opportunities of applying project-based activity to the integration of mathematical and natural-science disciplines, and to develop systematic recommendations on its broad application in the course of training in the subject.
Carson Lam Kai Shun*
Cancer has a long history in human health. Practically one-fifth of cancers are caused by virus infection. Thus, it is important for us to understand the virus-cancer infection mechanism. Statistically, we may perform the necessary causality regression analysis in such a situation to build up the corresponding model, just as in my previous case of influenza-weather infection. In the present research, I will interact the system of differential equations (the Lorenz system) with my HKLam theory and work out the recursive result we may get. Then we may proceed to the corresponding policy generated from the dynamic programming that solves the Markov Decision Process. In addition, we may apply the HKLam theory to chaotic time series and compare the model with a machine-learning one for a better selection, while failure is otherwise explained. Finally, I will also discuss a novel mathematical method for determining the saddle point of the Lorenz attractor together with gradient descent. The aim is to find the equilibrium of the Lorenz attractor under given initial conditions. It is hoped that the mathematics-statistics interaction, together with the causal regression (Artificial Intelligence) model, may finally help us fight diseases like virus-infected cancer and others.
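For readers who want to experiment with the Lorenz system the abstract refers to, it can be integrated numerically in a few lines. This sketch uses the classical parameters σ = 10, ρ = 28, β = 8/3 and a fixed-step fourth-order Runge-Kutta scheme; the step size and initial condition are illustrative, not taken from the paper.

```python
import math

# Classical Lorenz parameters.
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def lorenz(state):
    """Right-hand side of the Lorenz equations."""
    x, y, z = state
    return (SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z)

def rk4_step(state, dt):
    """One fixed-step fourth-order Runge-Kutta step."""
    def add(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = lorenz(state)
    k2 = lorenz(add(state, k1, dt / 2))
    k3 = lorenz(add(state, k2, dt / 2))
    k4 = lorenz(add(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# The non-origin equilibria sit at (±sqrt(beta*(rho-1)), same, rho-1);
# the vector field vanishes there, so an orbit started there stays put.
eq = (math.sqrt(BETA * (RHO - 1)), math.sqrt(BETA * (RHO - 1)), RHO - 1)
print(lorenz(eq))  # zero up to rounding

state = (1.0, 1.0, 1.0)
for _ in range(1000):
    state = rk4_step(state, 0.01)
print(state)  # a point on the attractor after t = 10
```

Checking that the vector field vanishes at the analytic equilibria is a cheap sanity test for any such integrator before it is coupled to a statistical model.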
Sudipta Basu*
In this article we work out the mathematics of a stochastic version of the Von Bertalanffy power-law model to find an explicit solution, and the MLEs of the model parameters are also worked out. This model is then applied to the COVID infection data (first-wave data) of South Korea after observing the nature of the growth, and some comparisons are made with the stochastic Gompertz model and the stochastic logistic model in this context; the stochastic Von Bertalanffy power-law model performs better than the other two, at least in this case. It is observed that most persons infected by this virus have experienced mild to moderate respiratory problems and many of them have recovered under normal treatment, but aged persons suffering from CVD (cardiovascular disease), severe respiratory problems, diabetes, cancer, etc., i.e., persons with co-morbidity, have faced serious trouble after getting infected (COVID positive) ([13]).
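As a rough illustration of the kind of growth curve involved (not the authors' fitted model), one common deterministic Von Bertalanffy power-law form is dX/dt = aX^m − bX, which saturates at the carrying capacity (a/b)^(1/(1−m)); all parameter values below are illustrative.

```python
# Sketch of the deterministic skeleton that the stochastic version
# perturbs with noise: dX/dt = a*X**m - b*X, integrated by explicit Euler.

def von_bertalanffy_step(x, a, b, m, dt):
    """One explicit Euler step of dX/dt = a*X**m - b*X."""
    return x + dt * (a * x**m - b * x)

def simulate(x0, a, b, m, dt, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(von_bertalanffy_step(xs[-1], a, b, m, dt))
    return xs

# With m = 2/3 the carrying capacity is (a/b)**3; the trajectory
# rises from x0 toward that level.
a, b, m = 1.0, 0.5, 2.0 / 3.0
traj = simulate(1.0, a, b, m, dt=0.01, steps=5000)
print(traj[-1])  # approaches (a/b)**3 = 8
```

A fitted model would replace the hand-picked a, b, m with the MLEs and add a diffusion term to the drift above.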
Nhuan Doan
A perfect number is a positive integer that is equal to the sum of its positive divisors, excluding the number itself. Currently, only even perfect numbers are known. It is not known whether any odd perfect numbers exist. So, is there an odd perfect number? No, there is not. (See Unit 4, …, 7)
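The definition above is easy to check by direct computation; a short script recovers the first four (even) perfect numbers:

```python
# n is perfect when the sum of its proper divisors equals n.
# Known even perfect numbers follow Euclid's form
# 2**(p-1) * (2**p - 1) with 2**p - 1 prime.

def is_perfect(n):
    """True if n equals the sum of its positive divisors below n."""
    if n < 2:
        return False
    total = 1  # 1 divides every n >= 2
    d = 2
    while d * d <= n:
        if n % d == 0:
            total += d
            if d != n // d:       # avoid double-counting a square divisor
                total += n // d
        d += 1
    return total == n

print([n for n in range(2, 10000) if is_perfect(n)])  # [6, 28, 496, 8128]
```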
The COVID-19 pandemic is a rolling crisis that required policymakers around the world to take action to manage the pandemic's spread. During the period when vaccination was not available, management of the pandemic's spread relied on Non-Pharmaceutical Interventions (NPIs). A large number of mathematical models have been proposed to predict the pandemic's spread and analyze the influence of various NPIs under multiple scenarios, getting more accurate and sophisticated over time. In this work, we review the latest mathematical models of pandemic-spread NPIs and their main shortcomings, specifically their use of a homogeneous population model rather than a heterogeneous one.
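A minimal homogeneous SIR model of the kind the review discusses can be written in a few lines, with an NPI crudely represented as a cut in the contact rate β from a chosen day onward; all parameter values are illustrative, not calibrated to any real outbreak.

```python
# Homogeneous SIR with daily Euler steps; the NPI multiplies the
# contact rate beta by npi_factor from day npi_day onward.

def sir_peak(beta, gamma, s0, i0, days, npi_day=None, npi_factor=1.0):
    """Return the peak infected fraction over the simulated period."""
    s, i, r = s0, i0, 1.0 - s0 - i0
    peak = i
    for day in range(days):
        b = beta * (npi_factor if npi_day is not None and day >= npi_day else 1.0)
        new_inf = b * s * i      # new infections this day
        new_rec = gamma * i      # new recoveries this day
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak

baseline = sir_peak(beta=0.3, gamma=0.1, s0=0.999, i0=0.001, days=300)
with_npi = sir_peak(beta=0.3, gamma=0.1, s0=0.999, i0=0.001, days=300,
                    npi_day=10, npi_factor=0.5)
print(baseline, with_npi)  # the early NPI lowers the epidemic peak
```

Heterogeneous models, the review's main concern, replace the single (s, i, r) triple with age- or contact-stratified compartments and a mixing matrix.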
Bermejo Vicente*, Pilar Ester and Isabel Morales
PEIM (Programa Evolutivo Instruccional para Matemáticas) is a constructivist intervention program intended to improve students' mathematical skills. It is based on four fundamental pillars. Students are the protagonists of the classroom and construct their own mathematical knowledge. The teacher guides and facilitates students' construction of mathematical knowledge; he or she knows well and respects the process students follow to learn each of the most significant school mathematical contents (i.e., strategies, errors, etc.), and the use of individual mathematical profiles turns out to be very useful. Mathematical contents should be selected and sequenced according to their difficulty, avoiding as much as possible rote and mechanical tasks and always looking for situations meaningful to students. Finally, the classroom climate must be constructivist, highlighting each student's cooperative work and independence in the construction of their own knowledge and using materials that facilitate the learning process. The implementation of PEIM demonstrates its effectiveness not only in solving problems, but also in choosing more advanced problem-solving strategies and notably reducing conceptual errors. It should also be noted that students' mathematical performance appears related to the teacher's level of acceptance and application of the constructivist approach in the classroom.
Bermejo Vicente*, Pilar Ester and Isabel Morales
In the present article, we analyze the characteristics of bilingual education related to learning mathematics in the early years. When language immersion is implemented while teaching a specific subject such as mathematics, we differentiate between the students' language of instruction and the students' mother tongue. This article delves into the verbal aspect of mathematics and how the language of instruction influences tasks in which the verbal component is essential, such as problem solving. The results presented in this investigation compare the influence of the language of instruction on 1st and 2nd grade students whose language of instruction coincides with their mother tongue against groups whose language does not coincide. The results show greater effectiveness in the tasks executed by students whose language of instruction coincides with their mother tongue. However, we also observe how this distance tends to shorten in the 2nd grade groups.
Kisakye Irumba
Mathematical models are a valuable tool for investigating a large number of questions in metabolism, genetics, and gene-environment interactions. A model based on the underlying chemistry and biochemistry is a platform for in silico biological experimentation that can reveal the causal chain of events connecting variation in one quantity to variation in another [1]. We discuss how we build such models, how we have used them to investigate homeostatic mechanisms, gene-environment interactions, and genotype-phenotype mapping, and how they can be used in precision and personalized medicine.
Musa Ekon
Algebra at the more advanced level is often described as modern or abstract algebra. In fact, both of these descriptions are partly misleading. Some of the great discoveries in the upper reaches of modern algebra (for example, the so-called Galois theory) were known many years before the American Civil War, and the broad aims of algebra today were clearly stated by Leibniz in the seventeenth century [1]. Hence, "modern" algebra is not so very modern after all; and how abstract is it? Abstraction is relative; one person's abstraction is another person's meat and potatoes. The abstract tendency in mathematics is akin to the changing of moral codes or of tastes in music: what shocks one generation becomes the norm in the next.
Anirban Goswami, Sudipta Basu, Proloy Banerjee and Shreya Bhunia
In this article we construct a stochastic model that is a generalized stochastic version of the Von Bertalanffy power-law model and the Richards model, which can be used to describe biological growth phenomena according to the situation and the suitability of the model. It is mainly constructed to explain the growth dynamics of patients infected by COVID-19 in South Korea. We attempt to find the expression for the variable of interest at time t, and the MLE of the growth-rate parameter is also worked out. The model is applied to real-life data on patients infected by COVID-19 in South Korea after observing the growth pattern. The model could also be applied to the data sets of other countries where no lockdown was imposed as a precautionary measure. A comparative study is then made between some well-known models and special cases of the model described here. It is found that the special cases of the model described in this article fit the data better than the others.
Bhagavathula Venkata Narayana Murthy*
In this paper, the concepts of equiprime ideals and equiprime semimodules over Boolean-like semirings are introduced, and some examples are furnished. It is proved that if P is an equiprime R-ideal of the R-module M, then (P:M) is an equiprime ideal of R. We also establish that if M is an equiprime R-module, then (0:M) is an equiprime ideal of R, and we establish an important characterization of the equiprime ideal of R in the semimodule structure over R. Using this result we prove that if R is a Boolean-like semiring and R ≠ 0, then R is equiprime if and only if there exists a faithful equiprime R-module M. Finally, we show that if M is an equiprime R-module and N is an invariant subgroup of R such that N is not contained in (0:M), then M is an equiprime N-module.
DOI: 10.37421/2090-0902.22.13.355
DOI: 10.37421/2090-0902.22.13.351
DOI: 10.37421/2090-0902.22.13.354
DOI: 10.37421/2090-0902.22.13.352
DOI: 10.37421/2090-0902.22.13.353
DOI: 10.37421/2090-0902.22.13.360
DOI: 10.37421/2090-0902.22.13.359
DOI: 10.37421/2090-0902.22.13.358
DOI: 10.37421/2090-0902.22.13.357
Daniel Larry
DOI: 10.37421/2090-0902.22.13.356
Daris Wang*
DOI: 10.37421/2090-0902.2022.13.363
Michel Cope*
DOI: 10.37421/2090-0902.2022.13.362
Batur Gilani*
DOI: 10.37421/2090-0902.2022.13.364
Eddy A
DOI: 10.37421/2090-0902.2022.13.365
Jenni Freddie
DOI: 10.37421/2090-0902.2022.13.366
DOI: 10.37421/2090-0902.2024.15.461
Counting independent sets is a fundamental problem in combinatorics with applications in various fields such as computer science, graph theory and statistical physics. In this article, we delve into the counting rules for computing the number of independent sets in graphs. We explore different techniques and algorithms employed to efficiently determine the count of independent sets, discussing their applications and implications in diverse domains.
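As an illustration of the counting problem the abstract describes, the following sketch counts independent sets by exhaustive enumeration; the function name and graph encoding are our own illustrative choices, not taken from the article, and the method is the naive O(2^n) baseline rather than the more efficient techniques the article surveys.

```python
from itertools import combinations

def count_independent_sets(n, edges):
    """Count all independent sets (including the empty set) of a graph
    on vertices 0..n-1 by exhaustive enumeration: O(2^n * |E|)."""
    edge_sets = [frozenset(e) for e in edges]
    count = 0
    for k in range(n + 1):
        for subset in combinations(range(n), k):
            chosen = set(subset)
            # A vertex subset is independent if no edge lies entirely inside it.
            if not any(e <= chosen for e in edge_sets):
                count += 1
    return count

# Path graph P3 (edges 0-1 and 1-2): its 5 independent sets are
# {}, {0}, {1}, {2}, and {0, 2}.
print(count_independent_sets(3, [(0, 1), (1, 2)]))  # → 5
```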
DOI: 10.37421/2090-0902.2024.15.462
Metaheuristic algorithms have emerged as powerful tools for solving optimization problems across various domains. These algorithms offer innovative approaches to finding high-quality solutions, often outperforming traditional optimization techniques. In this article, we delve into the realm of metaheuristic algorithms, exploring their principles, applications and comparative advantages. We discuss several prominent metaheuristic algorithms, including genetic algorithms, simulated annealing, particle swarm optimization and ant colony optimization. By understanding these algorithms' underlying mechanisms and characteristics, practitioners can effectively apply them to tackle complex optimization challenges.
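Of the algorithms the abstract names, simulated annealing is the simplest to sketch. The minimal version below (our own illustrative code, with hypothetical parameter defaults) accepts worse moves with probability exp(-delta/T) and cools the temperature geometrically.

```python
import math
import random

def simulated_annealing(cost, start, neighbor, t0=1.0, cooling=0.95,
                        steps=500, rng=None):
    """Minimal simulated annealing for minimization: always accept
    improving moves; accept worsening moves with probability
    exp(-delta / T), where T cools geometrically each step."""
    rng = rng or random.Random()
    x, fx = start, cost(start)
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = cost(y)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
        t *= cooling
    return x, fx

# Toy usage: minimize f(x) = x^2 starting far from the optimum.
rng = random.Random(0)
best_x, best_f = simulated_annealing(
    lambda x: x * x, 10.0,
    lambda x, r: x + r.uniform(-1.0, 1.0), rng=rng)
```

The same accept/cool skeleton carries over to combinatorial problems by swapping in a discrete neighborhood move.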
DOI: 10.37421/2090-0902.2024.15.463
DOI: 10.37421/2090-0902.2024.15.464
Nanofluids, colloidal suspensions of nanoparticles in base fluids, exhibit fascinating thermophysical properties that have garnered significant attention in various fields, particularly in thermal engineering and nanotechnology. Accurate prediction of these properties is crucial for their effective utilization in applications such as heat transfer enhancement, cooling systems and advanced manufacturing processes. Traditional methods for predicting nanofluids properties often face challenges due to the complex interactions between nanoparticles and base fluids. In recent years, artificial intelligence (AI) techniques have emerged as promising tools for predicting the thermophysical properties of nanofluids. This article provides a comprehensive review of the application of AI methods, including machine learning and deep learning, in predicting the thermophysical properties of nanofluids. The review explores various AI algorithms, data sources and modelling approaches employed in this domain, highlighting their advantages, limitations and future prospects.
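The review concerns machine-learning and deep-learning predictors; as a minimal stand-in for that pipeline, the sketch below fits the simplest possible model, a one-feature least-squares regression, to synthetic (not measured) data whose linear trend in volume fraction is merely illustrative. All names and numbers here are our own assumptions, not from the article.

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit y ≈ a*x + b via the closed-form
    normal equations for a single feature."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Synthetic illustrative data: thermal-conductivity ratio k/k0 versus
# nanoparticle volume fraction phi, generated from k/k0 = 1 + 3*phi.
phi = [0.00, 0.01, 0.02, 0.03, 0.04]
k_ratio = [1.00, 1.03, 1.06, 1.09, 1.12]
a, b = fit_linear(phi, k_ratio)  # recovers a ≈ 3.0, b ≈ 1.0
```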
DOI: 10.37421/2090-0902.2024.15.465
The Monge–Ampère equation, stemming from classical geometry, has found significant applications across various fields, including differential geometry, optimal transportation and even image processing. The solvability criterion for systems arising from Monge–Ampère equations plays a pivotal role in understanding the existence and uniqueness of solutions to these equations. In this article, we delve into the mathematical intricacies of this criterion, exploring its theoretical foundations and practical implications. We begin by providing an overview of the Monge–Ampère equation and its significance. Subsequently, we delve into the formulation of systems derived from this equation and elucidate the solvability criterion, discussing its mathematical underpinnings and implications in diverse contexts. Through concrete examples and applications, we illustrate the relevance and utility of this criterion in various mathematical and applied domains.
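For reference, the classical Monge–Ampère equation discussed in the abstract can be written in its standard form:

```latex
\det\big(D^2 u\big) = f(x,\, u,\, \nabla u) \quad \text{in } \Omega \subset \mathbb{R}^n,
\qquad u \ \text{convex},
```

where \(D^2 u\) is the Hessian of the unknown \(u\); convexity of \(u\) is the usual requirement making the equation (degenerate) elliptic, and solvability criteria constrain \(f\) and the boundary data.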
DOI: 10.37421/2090-0902.2024.15.466
DOI: 10.37421/2090-0902.2024.15.467
DOI: 10.37421/2090-0902.2024.15.468
DOI: 10.37421/2090-0902.2024.15.469
DOI: 10.37421/2090-0902.2024.15.470
Physical Mathematics received 686 citations as per Google Scholar report