Arbitrary quantization vs Stable math constructs
author: m concoyle
email: martinconcoyle@hotmail.com
or Inequality vs. equality
Science and math now (2012) have only a very slight relation to practical development.
Though this is correct, the propaganda system says that it is incorrect. Unfortunately, people believe the propaganda.
Nonetheless, one must ask:
Does language have any meaning if it is not addressing issues concerning "providing information about existence," so as to facilitate creativity in a context of practicality?

What is the purpose of professional math constructs?
Are they about math properties to be used for greater ranges of creative possibilities?
or
Are they math properties to be used to quantize distinguishable properties of the world, so as to create the appearance that certain distinguishable properties are related to math patterns in a measurably verifiable manner, and so to turn arbitrary contexts into scientific truths, where the quantization is based on indefinable randomness?
There are two different types of precise descriptions used to describe the underlying patterns which are believed to be involved in the natural composition of a physical system, where the point is to try to explain how observed measured properties are caused: either by "some cause," or as properties resulting from "random events which simply happen," due to the randomness of material existence.
The two different types of descriptive languages used in physical science are based on,
(1) geometry (classical physics), or
(2) randomness (quantum physics).
(1) There is the geometrically based descriptive structure of classical physics, wherein there is both material geometry and its measurable properties, which can be probed by local, linear measuring structures (derivatives) defined on functions (which represent physically measurable values), which, in turn, are contained in metric-invariant metric-spaces (both Euclidean space and spacetime). These local measuring properties can be used to define the differential equations of general (or a wide range of) material systems, based on defining inertia, m, in the formula F=ma, in a geometric context of force-fields, F, which have a local vector structure (at the point of measurement, ie where the differential equation is defined), as do "the local changes of motion," which are represented as a. Almost always, the nonlinear classical differential equations for physical systems provide a limit-cycle structure, found from the differential equation's critical points.
Virtually all of our current technical development is based on classical physics, since it is a geometrically based description, and the solvable aspects of this description are very controllable.
When these linear differential equations are separable (parallelizable and orthogonal, both locally and globally) then they are solvable, so as to give information (associated to measurable properties) in the form of a (global) solution function, in a (very useful) geometric context. Control of nonlinear systems by means of feedback can be effected through the information that limit cycles provide about the physical system.
This technique of defining a general system's differential equation applies to general systems over a wide range of contexts.
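The classical construct just described (a linear, solvable differential equation defined from F=ma) can be sketched numerically. The harmonic oscillator, the step size, and the RK4 integrator below are illustrative choices, not taken from the text; the point is only that the linear equation is solvable and its numeric solution matches the closed-form one.

```python
import math

def solve_f_ma(omega, x0, v0, t_end, dt=1e-4):
    """Integrate F = ma for the linear oscillator F = -m*omega^2*x
    (the mass cancels), using a simple RK4 step. This linear, solvable
    system has the closed form x(t) = x0*cos(omega*t) + (v0/omega)*sin(omega*t)."""
    def deriv(state):
        x, v = state
        return (v, -omega**2 * x)
    x, v = x0, v0
    for _ in range(int(t_end / dt)):
        k1 = deriv((x, v))
        k2 = deriv((x + 0.5*dt*k1[0], v + 0.5*dt*k1[1]))
        k3 = deriv((x + 0.5*dt*k2[0], v + 0.5*dt*k2[1]))
        k4 = deriv((x + dt*k3[0], v + dt*k3[1]))
        x += dt*(k1[0] + 2*k2[0] + 2*k3[0] + k4[0])/6
        v += dt*(k1[1] + 2*k2[1] + 2*k3[1] + k4[1])/6
    return x

omega, t = 2.0, 1.0
numeric = solve_f_ma(omega, x0=1.0, v0=0.0, t_end=t)
exact = math.cos(omega * t)
print(abs(numeric - exact) < 1e-6)  # the numeric orbit matches the closed form
```

The controllability claimed in the text corresponds to the fact that the solution is determined entirely by the initial conditions x0 and v0.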
These techniques can be applied in a practically useful context to physical systems which are:
1. gravitating (better than the information for systems described by general relativity [where such information about a physical system with more than one body does not really exist]),
2. electromagnetic,
3. mechanical,
4. thermal, and
5. statistical systems.
(2)
While (on the other hand):
1. statistical,
2. nuclear,
3. atomic,
4. molecular, and
5. crystalline systems are all quantum systems,
ie statistical systems can be described in either context (classical or quantum).
The observed discrete and precisely stable energy structures of nuclei, atoms, molecules, and crystals [as well as other geometrically bound material systems] are quantum systems...
(which have precise discrete quantum properties)
... , whose descriptions are assumed to be based on an indefinably random math construct, a construct which is modeled by function spaces, upon which sets of operators act so as to diagonalize the system's linear energy operator (or wave-operator) [defined on a function space (which has the structure of an infinite-dimensional vector space)], so as to try to identify the system's precise, discrete energy states as the eigenvalues of the function space's (diagonalized) spectral functions.
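The procedure described above, diagonalizing a linear energy operator on a function space to extract discrete energy eigenvalues, can be sketched in finite dimensions. The harmonic-oscillator operator, the grid, and the units (hbar = m = omega = 1) below are illustrative assumptions, not taken from the text.

```python
import numpy as np

# Finite-difference model of an energy (wave) operator acting on a
# function space: H = -(1/2) d^2/dx^2 + (1/2) x^2 (harmonic oscillator).
# Diagonalizing the matrix yields the discrete energy eigenvalues,
# which should lie close to n + 1/2.
N, L = 1000, 10.0
x = np.linspace(-L, L, N)
dx = x[1] - x[0]
# second-derivative operator via a tridiagonal stencil
kinetic = (-0.5 / dx**2) * (np.diag(np.full(N - 1, 1.0), -1)
                            - 2.0 * np.eye(N)
                            + np.diag(np.full(N - 1, 1.0), 1))
potential = np.diag(0.5 * x**2)
H = kinetic + potential
energies = np.linalg.eigvalsh(H)[:4]
print(energies)  # close to 0.5, 1.5, 2.5, 3.5
```

The "diagonalization" of the text is here literally `eigvalsh`: the operator is diagonal in the basis of its eigenfunctions, and its diagonal entries are the discrete spectrum.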
For quantum (discrete) systems in which the (assumed) particle components occupy the different discrete energy levels, the statistics will be based on the statistical properties of either Bose or Fermi (particle) components, where these two types of (state-occupying) components (or particles) are distinguished by their internal-symmetry spin-states: Fermions allow only one component to occupy a discrete energy-level at a time, while, on the other hand, Bosons allow any number of particles to occupy a discrete energy-level at a time.
The energy of occupancy depends on the system's ground-state and its discrete energy levels, as well as on the energy of the system within which the components are contained, eg a semiconductor placed within an electric circuit [where the energy depends on the energy of the classical electric circuit too], or on the spectral type of light which is present in the system.
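The occupancy rules above correspond to the standard Fermi-Dirac and Bose-Einstein mean-occupation formulas. A minimal sketch follows; the chemical potential, temperature, and energy values are arbitrary illustrative choices.

```python
import math

def fermi_dirac(E, mu, kT):
    """Mean occupancy of a level at energy E for Fermions: never exceeds 1,
    reflecting the one-component-per-level rule."""
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)

def bose_einstein(E, mu, kT):
    """Mean occupancy for Bosons (requires E > mu): can be arbitrarily large,
    since any number of Bosons may occupy one level."""
    return 1.0 / (math.exp((E - mu) / kT) - 1.0)

kT, mu, E = 1.0, 0.0, 0.1
print(fermi_dirac(E, mu, kT))    # below 1: at most one Fermion per level
print(bose_einstein(E, mu, kT))  # above 1: many Bosons pile into one level
```

The only difference between the two formulas is the sign in the denominator, yet it encodes exactly the occupancy distinction the text describes.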
If quantum properties can be coupled to a linear, solvable classical system, then these quantum properties can be used (or controlled) based on the classical control one has over the classical system and its capacity to change the discrete energy structure of the quantum component.
However, the stable, discrete, precise energy structures of quantum systems, "characterized by the number of their components," imply that these quantum systems are forming in a controlled, causal (and geometric) context for their interactions.
Note: Action-at-a-distance is a property of both gravitating systems (namely, the Newtonian models of gravitating systems, which are the only useful descriptions of gravitating systems, eg general relativity has no capacity to describe the stable properties of the solar system) and quantum systems.
Action-at-a-distance is also called the property of non-localness (usually in the context of quantum physics).
It is the 1/r singularities of a quantum system's particle-charge components which cannot be handled in a consistent manner by mathematics [the math diverges, but the systems are stable], so that nuclei, general atoms (ie atoms composed of five or more charged components), and molecules all seem to have no valid, precise descriptive structure emerging from the laws of quantum physics. Rather, there are usually only vague approximations for these fundamental quantum systems, based on the property of spherical symmetry (and an associated property of conservation of angular momentum), which in most of these systems does not exist, since there are too many components making up the system for spherical symmetry to exist, eg an atom composed of a nucleus and four (other) electrons has five point-charge singularity points (and has no spherical symmetry).
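The divergence associated with a 1/r singularity can be made concrete with a standard textbook fact: the classical electrostatic field energy of a point charge, integrated outside a cutoff radius, grows without bound as the cutoff shrinks to zero. A minimal sketch, in units chosen so the prefactor is 1 (the function name is an illustrative choice):

```python
def point_charge_field_energy(cutoff):
    """Field energy of a unit point charge outside radius `cutoff`,
    in units where q^2/(8*pi*eps0) = 1: the exact value of the integral
    of 1/r^2 from `cutoff` to infinity, ie 1/cutoff. It diverges as the
    cutoff shrinks toward the point charge."""
    return 1.0 / cutoff

for eps in (1e-2, 1e-4, 1e-6):
    print(point_charge_field_energy(eps))  # grows without bound as eps -> 0
```

This is the simplest version of the divergence that renormalization, discussed below, is meant to work around.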
The unitary symmetry of an energy-invariant quantum system is extrapolated into a descriptive realm where there is no local geometry (due to both singularities as well as the uncertainty principle), yet the finite-dimensional unitary symmetry of identifiable particle-components (or elementary-particles) is determined in an assumed context of the local geometry of point-particle collisions (ie a logical inconsistency), where many of these particles are charges, and thus define a 1/r singularity.
In this description, based on wave-equations, material is associated to constants (coupling constants, such as charge, q), and patterns of (field-particle-collision) interactions are associated to Bose-particle unitary matrices (which usually only represent energy, but usually not mass).
This model (of particle-collision material-interactions) is derived from wave-phase invariance, which is supposedly a "local" invariance [ie it is assumed that the wave-phase of a global wavefunction can be changed locally]. This invariance is used to determine the allowable types of wave-equations, ie the types of unitary particle-state changes involved in a boson-material particle-collision, which represent physical laws, where it is required that the equations be gauge-invariant, which means that the equation which represents a physical law is invariant to certain types of changes to the field representation, ie changes of particle-state, which the solution to the equation defines.
Photons (associated to charge constants) have a scalar wave-phase {as well as a global phase-invariance}, but (all) other interaction particles (except for scalar Bosons) have local phase-invariance, as well as internal (symmetric, or gauge-invariant) particle-states [as spin is modeled as a particle's internal symmetry state].
However, a relation between such particle-collision models of material interactions and the observed stable properties of fundamental quantum systems, such as nuclei, does not exist.
It is claimed that these particle-collision interactions only adjust (by small perturbations) the wavefunctions of quantum systems [but, for general quantum systems, these wavefunctions "cannot be found"]. Nonetheless, due to singularities in the math structures... , especially for an assumed spherical symmetry as a basis for describing physical interactions, which implies a 1/r singularity for point-particle models of charged material... , the calculations of these perturbations diverge.
Thus, since the (so-called) perturbations, ie the particle-collision field adjustments to the wavefunction, cause the wavefunction to diverge, it is believed {Why is this believed?} that these perturbation calculations need to be readjusted, or renormalized.
Why?
It is a failed theory, yet it is the only set of ideas which are allowed in professional physics journals.
This supposed (renormalized) adjustment to a wavefunction is (claimed to be) needed due to the existence of singularities, and an assumed disintegration of local geometry which charged point-particles cause near such point-particles (as well as causing the disintegration of local harmonic averages of the wavefunctions of the quantum systems), as well as the disintegration of local spatial properties due to the uncertainty principle, when a particular point in space is specifically identified, eg the point where the point-particles collide.
In the process (math model) of renormalization, there is assumed to exist a chaotic structure of the space which contains point-particle charges, and this is supposed to require that calculations (supposedly associated to physical laws) be readjusted, in a rather major adjustment of the calculated quantities (or the quantitative structures), called renormalization. The chaotic structure of space is caused by the point-particle model of charges, which possess a 1/r singularity due to the spherically symmetric material-interaction structure, while the point of the particle causes changes in the point-particle's inertia, due to the uncertainty principle, for a descriptive containment structure based on randomness.
Do quantum systems have global wavefunctions which possess definitive wavephase structure, or not?
How does geometry get rebuilt from a local chaotic structure of space?
Comment:
Perhaps the local wave-phase-invariance is not consistent with a global (or geometric) math structure.
Thus, the idea of "local wave-phase-invariance" is better related to a change of dimensional levels, as a part of the descriptive structure.
The geometries of the separate dimensional levels would be:
1. discontinuous, or
2. discrete, and
3. hidden due to discreteness of "the constructed interaction process" and a significant need for a "large size system" involved "in a significant, or detectable, interaction" due to the physical constants, ie the constants defined between dimensional levels, as well as between different subspaces of the same dimension, where these (physical) constants determine the relation of the magnitude that physical parameters have to the strength of a material interaction.
The quantum description of stable wavefunctions and their assumed particle-collision adjustments is a practically useless descriptive structure, with endless fundamental paradoxes emanating from singularities, as well as from space-and-inertial structures of uncertainty, while the resolution of these paradoxes is done in ways which are used to fit the description to observed data, ie they are epicycle structures in an inconsistent math structure.
This leads one to consider "what the purpose of math constructs should be?" (see below)
How can existence be described in a measurable, controllable, and practically useful manner? It would have to be a description based on geometry (not based on probability).
Does language have any meaning if it is not addressing issues concerning "providing information about existence," to facilitate creativity in a context of practicality?
However, descriptive languages do not have to be based on the idea of materialism.
So what can be an alternative way in which to organize math patterns so as to describe the observed properties of the material world?
A geometric world without materialism
An alternative to the local symmetry properties of discrete particle-collision models of material interactions is a set of macroscopic discrete geometric constructs, defined within both a many-dimensional and a size hierarchy of closely associated (or similar) discrete geometries, where the changes in size scales between dimensional levels would be determined by the values of physical constants, or constant factors, defined between the different dimensional levels (as well as between different subspaces of the same dimension), which model:
metric-spaces, and
stable material systems, as well as
material interactions.
The material interactions are related to 2-forms and Newton's laws
[thus, it is gauge-invariant due to its extra potential term, d(alpha), being exact, ie A -> A + d(alpha) leads to d(A + d(alpha)) = dA + dd(alpha) = dA, since dd = 0, and since the metric-space of the interaction is closed and simply connected, in regard to the material interactions it contains].
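The gauge-invariance remark rests on the identity dd = 0, whose vector-calculus form is curl(grad f) = 0 for any smooth scalar f. A small numerical check follows; the grid size and the test function (standing in for the phase alpha) are arbitrary illustrative choices.

```python
import numpy as np

# Numerical check of dd = 0 (the identity behind dA + dd(alpha) = dA):
# curl(grad f) = 0 for any smooth scalar f. Discrete difference operators
# along different axes commute, so the identity survives discretization
# up to floating-point roundoff.
n = 40
ax = np.linspace(0.0, 1.0, n)
X, Y, Z = np.meshgrid(ax, ax, ax, indexing='ij')
f = np.sin(X * Y) + Y * np.exp(Z)          # arbitrary smooth "phase" alpha

gx, gy, gz = np.gradient(f, ax, ax, ax)    # d(alpha): the gradient 1-form
curl_x = np.gradient(gz, ax, axis=1) - np.gradient(gy, ax, axis=2)
curl_y = np.gradient(gx, ax, axis=2) - np.gradient(gz, ax, axis=0)
curl_z = np.gradient(gy, ax, axis=0) - np.gradient(gx, ax, axis=1)

residual = max(np.abs(curl_x).max(), np.abs(curl_y).max(), np.abs(curl_z).max())
print(residual < 1e-10)  # True: curl(grad f) vanishes, ie dd = 0
```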
The basis of the new descriptive construct for set containment is to provide a math basis for stability, ie the discrete hyperbolic shapes, which define a dimensional construct up to hyperbolic (metric-space) dimension-10, are very stable geometric shapes.
Note: A hyperbolic metric-space of 3-dimensions is equivalent to a spacetime metric-space of 4-dimensions.
The interaction structure is geometric, just like classical physics is geometric, but it depends on both discrete Euclidean shapes and on the spectral context of the entire containing high-dimension space, where each dimensional level (each particular metric-space subspace of a given dimension) possesses a stable discrete shape, and thus also possesses stable spectral properties. It is with the spectral properties of the overall containing space that the stable material systems must be in resonance.
Furthermore, the apparent particle-properties of local interactions are associated to the distinguished points of discrete shapes, which can be both large and small shapes, and which can be of various dimensions.
The metric-space is associated to physical properties, and thus it can also be associated to metric-space states, eg opposite physical properties.
Material interactions depend on these metric-space states, and this results in containment of this descriptive context by complex coordinates (so as to separate the opposite metric-space states into the real and pure-imaginary subsets of the complex-coordinates), and this can lead to a unitary-invariant descriptive construct, where the unitary invariance results from the conservation of energy associated to the shapes and their interactions contained in any particular subspace of a particular dimensional level.
The geometry of discrete Euclidean and hyperbolic shapes is based on circles (and/or discs), as is complex geometry. Thus, there is consistency between discrete Euclidean and hyperbolic geometries and the geometries of complex-coordinates.
There are also many holes in this new geometric construct, but the hole structure depends on whether the geometry is viewed within a metric-space, or whether the metric-space, itself, is contained (or viewed) within a higher-dimension metric-space. The property being measured determines which metric-space defines the idea of containment.
The new geometric construct is not based on spheres, so it does not have 1/r singularities associated to its descriptive structures.
This new math construct can identify the stable properties which are observed for material systems, and it does this in a much more mathematically consistent manner than is being done now by the professionals, where the current math constructs, which are now used (by the professionals, 2012) to describe material properties, have no math consistency associated to themselves.
The advantages of the new descriptive construct are that:
It is geometric, thus it can be practically useful,
It accounts for stable material systems,
It is mathematically consistent, thus it allows for deeper understanding of the math and physical constructs,
The various (so-called) internal symmetries (of particle-collisions) are associated to the discrete geometries of the different dimensional levels (or different subspaces of the same dimension, which exist in a high-dimension containing space, but which are organized so as to not be noticeable, due to structure and physical constants [ie due to different relative sizes in regard to containment in a geometric construct]), and the apparent particle-ness of observed (collision) events is about collisions centered at the distinguished points of the discrete shapes.
The construct of renormalization in particle-physics is eliminated in the new descriptive construct, due to the fact that point-particles are not part of the new model, and due to the fact that the new description is based (directly) on geometry, a geometric context in which the stability of material-systems can be established (directly) within the context of material interactions. Thus, the uncertainty principle is no longer a part of the descriptive structure.
An important math property
Commutativity of (matrix) operators, continuously defined at each point of a system's containing coordinate space, allows for:
1. the set structure of an inverse (one-to-one, and onto), as well as
2. a local linear structure, as well as
3. consistency with the metric-function at each point in a metric-invariant space of non-positive constant curvature, ie consistency with the Euclidean and hyperbolic metric-spaces.
Such a math structure is a "parallel and orthogonal set of (global) coordinates," and if this condition is defined continuously on the global shape, then this means that a physical system's differential equation (which is representable locally by matrices) is solvable, and thus controllable (by the system's boundary and initial conditions).
Commutativity implies that the linear maps (matrices) representing a physical system's differential equations... , acting on local vectors of a shape's natural coordinate system... , act in a diagonal manner, ie the operator and the shape are consistent with one another.
One also wants the math property that "the matrix stays diagonal" as one maps the system's equations to any nearby coordinate point of the system's coordinates. This is about moving (in a continuous manner) the coordinates in a parallel direction along the coordinate directions, so that the equation stays the same, which means that the differential equation represents a linear approximation between the system's solution function and its domain coordinates along those same coordinate directions, ie the solution function depends on (or has the same shape as) the coordinate functions, ie a derivative defines the coordinate directions, and they are consistent with the geometry of the solution function's graph. Furthermore, the differential equation has constant coefficients which are consistent with the constants of the metric-function, ie the differential equation is (also) metric-invariant. Thus, the solution function's values are linearly consistent with the metric-function's measures on the function's domain space.
These are the simple math properties which allow for stability, quantitative consistency, and controllability, and they are geometric properties which define local linear quantitative relations between the coordinates and the differential equation's solution-function values (where the function has the local structure of a vector), as well as defining a geometry which identifies a local linear quantitative relation between the coordinates and the metric-function.
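The role of commutativity described above can be illustrated with a standard linear-algebra fact: two commuting symmetric matrices are diagonalized by one common (orthogonal) coordinate frame, so both act "diagonally" along the same directions. A minimal sketch; the matrix size and random seed are arbitrary illustrative choices.

```python
import numpy as np

# Build two symmetric matrices that commute by construction: both are
# diagonal in the same random orthogonal frame Q.
rng = np.random.default_rng(0)
D1 = np.diag(rng.standard_normal(4))
D2 = np.diag(rng.standard_normal(4))
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # random orthogonal frame
A = Q @ D1 @ Q.T
B = Q @ D2 @ Q.T

print(np.allclose(A @ B, B @ A))        # True: A and B commute

# The eigenvector frame of A also diagonalizes B: one "parallel and
# orthogonal" coordinate system serves both operators.
w, V = np.linalg.eigh(A)
M = V.T @ B @ V
off_diag = M - np.diag(np.diag(M))
print(np.abs(off_diag).max() < 1e-8)    # True: B is diagonal in A's frame
```

This is the finite-dimensional analog of the text's "parallel and orthogonal set of (global) coordinates" in which the system's equations stay diagonal.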
**
What is the purpose of professional math constructs?
Are they about math properties to be used for greater ranges of creative possibilities?
or
Are they math properties to be used to quantize distinguishable properties of the world, so as to create the appearance that certain distinguishable properties are related to math patterns in a measurably verifiable manner, where the quantization is based on indefinable randomness, as quantum physics also is?
In this context (of quantizing) it is believed that if the (local) properties of a [nonlinear] system's measured values can be approximated locally, then these local properties can be pieced together to define an approximate whole set of properties for the system, but the properties of the whole system will (might) be chaotic and the system unstable. That is, such a pieced together description is quantitatively inconsistent and is related only slightly to practical creativity.
Arbitrary quantization has a relation to propaganda
The propaganda system can use the quantization construct based on indefinable randomness.
The structure of language within the propaganda system can create a social context (of narrow beliefs) where, for whatever social context, characterized by some distinguishable property, one (some business person) wants supported by measurable evidence, counting the frequency of the (random) distinguishable event allows the identifiable property to be quantized (where the appearance of the property, or random event, can be counted, so that probabilities can be determined).
This determines both a measure of success (of the propaganda), and an illusion that such properties are being measurably verified, and thus that they are relations which must be true (ie a scientific truth).
The probabilities can (might) be related to a function space, and subsequently related to a set of operators which can be applied to the function space, and then further quantitative relations developed (between the eigenvalues, the function space, and the set of operators).
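The counting construct described above, quantizing a distinguishable property by tallying relative frequencies of events, can be sketched as follows. The event labels, weights, and sample sizes are illustrative assumptions; the point is that the estimate looks "measurably verified" only when the event set is large and stable.

```python
import random

def frequency_probabilities(events):
    """Estimate a probability for each distinguishable event by counting
    its relative frequency -- quantization by counting."""
    counts = {}
    for e in events:
        counts[e] = counts.get(e, 0) + 1
    total = len(events)
    return {e: c / total for e, c in counts.items()}

random.seed(1)
# Draw events from a fixed underlying distribution: P(A)=0.7, P(B)=0.3.
big = random.choices(['A', 'B'], weights=[7, 3], k=100_000)
small = random.choices(['A', 'B'], weights=[7, 3], k=10)
print(frequency_probabilities(big)['A'])  # near 0.7 with a large, stable sample
print(frequency_probabilities(small))     # can be far off with only a few events
```

With an unstable or indeterminately sized event set (the text's "indefinable randomness"), the small-sample behavior is all one ever gets, and the estimated probabilities never settle.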
This is the story of quantum physics, in regard to a random set of local spectral events, in turn, associated to a very stable quantum system whose distinguishable properties are discrete, and definitive.
[Are these random particle-spectral events, or are they related to distinguished points of a stable geometric (containment) context?]
Thus, one asks,
Do such quantitative sets (associated to eigenvalues of function spaces) have any meaning?
Are the described patterns reliable?
In the context of indefinable randomness, the answer is, No.
For example, calculating business risks using these methods has shown that such calculations are not reliable.
Nonetheless, the contexts of the propaganda system can be adjusted so as to appear to be based on properties which are measurably verified properties.
The variations of thought concerning math and science's relation to practical creativity
The issue for both math and science is essentially the same, "describe existence," in regard to
either
1. Finding a descriptive construct which is both measurably consistent (with observed patterns) and useful, where measurability implies a process exists by which to build a new thing in the context of the system's description.
or
2. Finding a description which is quantitative, and which describes many of the details which are observed (or as many details as possible), so that the basis and organization of the descriptive construct is measurably verified. This could be the context in which a random event becomes distinguishable, ie the descriptive properties (the context in which certain events are distinguishable) are based on indefinable randomness, so that the observed measurements (counting the distinguishable events) are (or appear to be) consistent with the quantitative representation of the system. Thus, one can form arbitrary quantitative relations associated to particular contexts (such as particular ways to use language), wherein certain properties become distinguishable, and thus measurable, so as to give a context the appearance of scientific verification.
Furthermore, using a measurable description for practical creative development is not always necessary, especially if the math structures which already exist are already sufficient to build what business "wants built."
The trends (2012) of expression which are getting funded
The professional math community apparently has been encouraged to develop a way in which to quantize any distinguishable property, and thus the professional mathematicians deal mostly with the math constructs of:
1. Indefinable randomness (see below), and
2. Nonlinearity (when two quantitative sets are to be consistently related to one another, then they need to be related by the multiplication of a constant, ie they need to have a linear relation existing between their constructs associated to quantitative comparisons; thus nonlinear relations are not quantitatively consistent, and lead to bifurcations [between the two quantitative sets being compared], and thus they lead to unresolvable chaos).
The two math constructs (of indefinable randomness, and nonlinearity) have quantitatively inconsistent set structures, and depend on quantitative sets which are "too big," leading to many logical inconsistencies, or to paradoxes which are irresolvable (or many of these paradoxes should not be resolved, since, if they are resolved, they lead to further math difficulties).
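The claim that nonlinear relations lead to unresolvable chaos matches the standard behavior of nonlinear iterations; the logistic map is the textbook example. The parameters and starting points below are illustrative choices: two orbits that start a billionth apart separate to order one.

```python
def max_separation(x0, y0, r=4.0, n=60):
    """Iterate the nonlinear logistic map x -> r*x*(1-x) from two nearby
    starting points and return the largest separation seen: at r = 4 the
    map is chaotic, so tiny initial differences grow exponentially."""
    x, y, sep = x0, y0, abs(x0 - y0)
    for _ in range(n):
        x = r * x * (1.0 - x)
        y = r * y * (1.0 - y)
        sep = max(sep, abs(x - y))
    return sep

print(max_separation(0.2, 0.2 + 1e-9) > 0.1)  # True: order-one divergence
```

Sensitivity of this kind is why, as the text argues, a local nonlinear approximation pieced together over a whole system gives little predictive control.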
Though these math constructs can deal with a wide range of systems whose properties depend on a "great number of details," nonetheless these descriptions cannot describe the stable properties of fundamental systems, eg the stable spectral properties of nuclei, general atoms, molecules, and crystals, (as well as a valid description of the stable solar system) cannot be described using the laws of physics, and they are not within a context for description which lends itself to practical creative development (as the solvable aspects of classical physics are related to practical creative development).
Though the two descriptive contexts (of indefinable randomness, and nonlinearity), and their associated math techniques, could possibly describe a wide range of details, this is done in a context in which (first) these details are being quantized, wherein distinguishable and random properties are identified, and then these distinguishable properties are counted, so as to quantize the distinguishable property. However, if an elementary-event set (of random events) is a set of unstable events, or has an indeterminable number of events, then the counting process does not fit into a valid quantitative set, ie the probabilities are unreliable (this is the context of indefinable randomness).
Furthermore, the probabilities of distinguishable values emerging from this process might, in turn, be related to spectral-functions of function spaces, where spectral-functions can represent an event's probability; but if the spectral-functions' eigenvalues are not consistent with the spectral properties of the system (though the probabilities might be consistent with the relative counts of events), then such a quantization process... , eg applying operators to wavefunctions (so that the operators are to represent a physical property about which the wavefunctions form averages)... , is not valid.
That is, the distinguishable features of a physical system are not "reliably identifiable" in this method, and the probabilities of highly stable systems composed of relatively few particles form a descriptive context which is unrelatable to useful information about the system, and subsequently unrelatable to further creative development (ie development to be realized by understanding the system's properties), ie the descriptions do not lead to further practical (or technical) development.
Probabilities can only provide control over linear systems, where the probabilities are related to averages of values found over large reservoirs of many components, but probabilities for component-events, in regard to systems composed of a relatively small number of components, cannot be related to any type of control over the properties of such a system.
There is the possibility that many, if not most, of the observed (and measurable) details of existence might be outside of a consistent mathematical description.
Math constructs might not be able to describe both the fundamental patterns of stable material systems, and also describe all the details of the systems which are observed, such as nonlinear systems.
The limits of precise description
Thus, the fundamental question might be:
Does one want to describe the basis for stable material quantum systems (as well as the solar system),
or
Does one want to describe the details of random particleevents in space?
Perhaps, it is better to focus on describing the source of the stability of quantum systems.
Valid descriptions of nuclei, general atoms, molecules, and crystals, (as well as the stability of the solar system) based on physical law, currently (2012), do not exist.
Furthermore, the property that nuclei, general atoms, molecules, and crystals are very stable, and that they are precisely consistent with the properties of "congruent" physical systems (eg systems composed of the same number of components), means that the correct descriptions of these systems are contained within a simple, geometric, and solvable math context (that is, the geometric math context described in a previous paragraph).
Math descriptions need to be measurably reliable and logically consistent, and one wants a geometrically based description, so that what is described can be used in a practical context.
For a math description to be stable and consistent the math needs to be:
1. about differential equations applied to measurable properties represented as functions
so that the differential equations are:
2. finite dimensional,
3. geometric,
4. linear,
5. metricinvariant, where
6. the differential equation determines a differential 2-form, associated to the geometry of the "material" which is interacting with a component, so that
7. the 2-form can be related, by means of the fiber group (and the fiber group's local geometry), to spatial displacements of the material component's "discrete shape's distinguished point," and
8. if (the linear differential equations are) separable, then they are also solvable, and controllable.
[For an interaction structure, which is related to a discrete Euclidean shape, the Euclidean interaction shape must average over the properties of the material systems (which are modeled as discrete hyperbolic shapes) which are interacting, eg analogous to an interacting system's centerofmass coordinates.]
{The spatial displacement of an interacting material component's distinguished point is done in relation to the containing space's isometry (or unitary) fiber group.}
(This (above) is a math construct motivated by identifying a relation between force-fields and the spatial displacements of the interacting material, in a similar manner to Newton's law of force in Euclidean space, where in both the new and the Newtonian models there is action-at-a-distance.)
This set of above-mentioned properties requires a containment-set context of discrete Euclidean shapes and discrete hyperbolic shapes...
(excluding the nonlinear spherical shapes, ie including only the metric-spaces which have non-positive constant curvature)
... , as well as being able to also define "metric-space states" (used in the dynamic process, ie associated to a system's frame of reference [in Euclidean or hyperbolic space]), which, in turn, can be contained in a Hermitian space of complex coordinates, which are best contained in an associated unitary structure.
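The solvability claim in item 8 above (linear and separable implies solvable, and controllable) can be illustrated with a minimal numerical sketch. The `euler` helper and the example equation dx/dt = -kx are my illustrative assumptions, not constructs from the text: a linear, separable equation has the closed-form solution x0·e^(-kt), which a direct numerical integration reproduces.

```python
import math

def euler(f, x0, t_end, steps):
    """Integrate dx/dt = f(t, x) from t=0 with the explicit Euler method."""
    x, t = x0, 0.0
    dt = t_end / steps
    for _ in range(steps):
        x += dt * f(t, x)
        t += dt
    return x

k, x0 = 0.5, 2.0
# Linear, separable: dx/dt = -k*x has the closed-form solution x0*exp(-k*t),
# so the numerical and analytic answers can be checked against each other.
numeric = euler(lambda t, x: -k * x, x0, t_end=4.0, steps=100_000)
analytic = x0 * math.exp(-k * 4.0)
print(abs(numeric - analytic) < 1e-3)  # → True
```

A generic nonlinear equation offers no such closed form against which to check, which is the contrast the list above draws.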
Why have the above math constructs never been considered?
One trap "into which the professional math community seems to fall" is that they try to define (quantitative) sets which are "too big." In this context there is a sense that there will always be ways in which to quantitatively accommodate a math model, because there are so many ways in which to organize (or define) converging constructs.
Sets can be quantitatively shaped so as to conform to any observed pattern, especially if the math constructs are based on indefinable randomness and nonlinearity.
However, in the context of discrete Euclidean and hyperbolic shapes, where the discrete hyperbolic shapes determine stability, the set of quantitative properties, upon which both the containment set and the properties of the contained material interactions depend for their existence, is a finite set (or what may be a finite set).
That is, the new math construct is based on a finite quantitative set.
Education
The education system is such that spurious facts about certain contexts are memorized and considered, but never integrated into a person's personal picture of this knowledge, ie it is not knowledge based on free inquiry which is to be used in a practically creative context.
This results in this type of education system identifying a certain type of individual... , who obsesses over memorized details and who possesses only a (specific) mental model unrelated to a valid context for useful descriptions of the observed patterns which are being described... , as having a superior mental capacity.
That is, it is only the Asperger's types (and even more autistic mentalities, but who retain a capacity to talk) who can function in such a disconnected structure of knowledge.
These mental types are identified as mentally superior, and subsequently promoted as highly valued wage-slaves.
This is social engineering, which depends on a very authoritarian management structure.
[This is about the extreme amount of militarized control that exists over the management of society's institutions.]
The first thing the public is required to accept (in regard to education) is that intelligence is identified as a property of being familiar with a wide range of certain aspects of culture.
That is, being able to absorb cultural details without placing them in a valid context of knowledge is identified as being intelligent.
This allows autism to be singled out as being mentally superior.
Such narrow, detail-oriented (autistic) viewpoints fit well with business models where business risk and the development of technology are interrelated.
The oil companies squelched the "renewable energy" ideas of (say) Tesla, in order to pursue the easier-made profits from oil.
Truly new technologies disrupt business models, eg microprocessors vs. the mainframe computer model of IBM (Apple and Microsoft interfered with the IBM business model, more so Apple).
This absurd, disconnected relation between education and knowledge, which favors an autistic mind (which obsesses over narrow sets of abstract details), is presented as competition, and a subsequent determination of mental superiority (which is really a thoughtless, obsessive expression of detailed memories, most often unrelated to practical, useful creativity). This is done to narrow thought within society, so that new technology development does not disrupt a business monopoly's business plan.
This is a form of social engineering, allowed (enabled) by authoritarian hierarchical management, but it really causes knowledge to become very limited and narrow, and filled with many unrelatable details.
This is (also) a good criticism of particle-physics, which is very limited and narrow, and filled with many unrelatable details, since the forces caused by particle-collisions are unrelatable to the stable structures of the nucleus, the one thing (the nucleus) which should be explained by particle-physics, but which is not being carefully described. Nonetheless, the probabilities of particle-collisions are related to the rates of (nuclear) reactions, and thus this is the best knowledge to be used by the nuclear weapons industry, and the nuclear industry does not want new physics being used to develop new technologies, since the existing monopolies are satisfied with the current state of knowledge.
Thus, particle-physics is being extended into either... string theory (a contorted geometric structure which tries to fit abstract algebraic patterns so as to remain consistent with materialism) or... algebraic extensions up into high-dimensional (but finite-dimensional) unitary (Hermitian) algebras, which are contained in a complex geometry more related to solving algebraic equations than to picturable geometries.
These descriptive extensions (of the observed patterns of particle-physics) have no motivation other than trying to fit data to math patterns related to probabilities of particle-collisions. Furthermore, these string theories etc are math patterns unrelated to practical technical development.
Correlations (are emphasized over the stable and the causal)
Quantization is used to establish arbitrary correlations between variables (or between quantitative sets contained within an assumed containment set), but meaningful correlations are really about actual causal relations, whereas arbitrary correlations allow a supposedly descriptive context (descriptive language which is actually useful) to be identified, and thus these correlations (it seems to be believed) can form a basis for "research."
But it is a research immersed in a context of absurdity because of its great complications, eg many variables.
Indefinable randomness and nonlinearity, as the basis for measurable description (ie calculability), have allowed data (observed patterns) to be correlated to arbitrary math patterns, eg particle-collisions are associated to rates of reactions, but this is a very narrow viewpoint of a material interaction...
[where it is clear that subtle geometry is also a part of a material interaction process, where the 1/r geometry (and singularity) has been a thorn in the side of quantum physics]
... , which is used as a basis for describing material properties and interactions, but it is a descriptive construct which has no relation to practical creative development.
Thus, people come to believe social value is associated to (non-creatively useful) complicated patterns arbitrarily related (by a vast sea of correlations) to measuring and data-fitting.
Arbitrary distinguishable properties are most often not quantizable in a precise and consistent descriptive language.
This is the core issue in regard to the failings of professional math communities.
Furthermore, measurable verification does not make the math pattern... ,
which has been (or is being) endlessly adjusted to fit the data
... , true.
For example, Ptolemy's ideas were not true, yet they were measurably verified.
Instead, the truth of a descriptive and precise math pattern needs to also be associated to a wide range of practical applicability in regard to furthering technical development, as are Newton's laws.
The description needs to not only be (relatively) accurate, but it also needs to be practically useful, eg lead to new inventions, ie it needs to be a geometry-based description.
Is measuring about quantization and probabilities, or is it about (real) geometric measuring and building in a stable geometric context, wherein one can hold and use mental pictures (as one meditates in a creative context)?
Communication
The main reason people have trouble communicating is that there is only one voice which is given supreme authority, and only those people who strive to be consistent with that authority are considered as being a part of that same propaganda system.
This is the essence of "peer review" in professional science journals, journals which would exclude the voice of Copernicus, since Copernicus was challenging the voice of authority.
Call authority science... , or call authority religion... , authority is a social construct most characterized by the violence required to maintain a social division of society into superior and inferior (ideas, or people, etc).
The trick of the propaganda system is to construct a language around a "dialectic of Hegelian opposites," so that the language appears to be complete, as long as there is only one voice, which is allowed (which has the authority) to identify the middle points of the set of opposites (of the dialectic) for the (fake, or one-sided) discussion.
In this discussion, both science and math are basically bottled up in a context of quantizing and correlating in a vast sea of complications, while the obvious stable systems suggest that such a currently used (2012) approach to measurable description, based on randomness and nonlinearity, has little relevance to practical creativity and technical development,
ie knowledge related to practical creative development, which results in job development, is marginalized by a push to quantize and correlate.
An edifice of authority is built on a vast sea of complicated details, which are represented as verified science and rigorous math.
But these constructs are arbitrary.
A correlation which is valid 10% of the time is considered a causal relation (or a causally correlated relation).
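The point that weak correlations get mistaken for causes can be made concrete: two series of independent random numbers will generally show a nonzero sample correlation by chance alone. This is a minimal sketch; the `pearson` helper and the random series are my illustrative assumptions, not constructs from the text.

```python
import random

random.seed(1)

def pearson(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Two independent gaussian series: any correlation between them is chance,
# yet a naive reading would report it as a (weakly) "correlated" finding.
xs = [random.gauss(0, 1) for _ in range(30)]
ys = [random.gauss(0, 1) for _ in range(30)]
r = pearson(xs, ys)
print(f"chance correlation r = {r:.3f}")  # nonzero, but implies no causation
```

The series were generated with no causal link at all, so whatever value of r appears is pure sampling noise.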
Propaganda and quantifying random and nonlinear processes
The belief that one can quantify and measure any distinguishing feature, which is observed in experience, by the methods of indefinable randomness, is the part of modern intellectual expression which has most led to an inability to creatively develop further aspects of the perceived and described human experience, and it has led to the stagnation and collapse of practically and creatively useful expressions of ideas.
Quantization is used to establish both arbitrary correlations, and quantitatively inconsistent relations which relate (arbitrary) observed patterns (or distinguishable features) to measured verifications (of distinguishable features within particular contexts) as well as to unstable contexts within which feedback-based control over systems is developed.
Feedback is defined on a context of relativeness
This process of developing feedback-based control over systems is most often about identifying a system's nonlinear differential equation [where the range of validity of the nonlinear differential equation is indefinable, and thus it is an unreliable means of control], and identifying the critical points of the differential equation, which, in turn, are related to "limit cycles" of the system's diverging or converging dynamic (or changing) properties, identified as regions in the solution function's domain space.
It is also done in a context of vast complication on very big containment sets (eg very big quantitative sets, or sets composed of many variables associated to some form of correlation).
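The critical-point-and-limit-cycle picture described above can be sketched numerically. The Van der Pol oscillator used here is a standard textbook nonlinear system, chosen by me as an illustration (it is not named in the text): its single critical point at the origin is unstable, and trajectories started from different points converge onto the same bounded limit cycle.

```python
def van_der_pol(state, mu):
    """Right-hand side of the Van der Pol oscillator: x' = y,
    y' = mu*(1 - x^2)*y - x.  Its only critical point is (0, 0)."""
    x, y = state
    return (y, mu * (1.0 - x * x) * y - x)

def integrate(state, mu, dt, steps):
    """Crude explicit-Euler integration, enough to reach the limit cycle."""
    for _ in range(steps):
        dx, dy = van_der_pol(state, mu)
        state = (state[0] + dt * dx, state[1] + dt * dy)
    return state

# Two very different starting points are both drawn onto the limit cycle,
# a bounded closed orbit encircling the unstable critical point.
a = integrate((0.1, 0.0), mu=1.0, dt=0.001, steps=200_000)
b = integrate((3.0, 0.0), mu=1.0, dt=0.001, steps=200_000)
print(a, b)  # both end states remain bounded, away from the origin
```

The bounded orbit is what feedback-based control schemes latch onto, even though, as the text notes, the model's range of validity is hard to pin down.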
Creativity and new forms of knowledge
Creativity depends on equality, while creativity opposes the context of narrow categories, and arbitrary measuring (claimed to verify a pattern, but it is a practically useless pattern) which has no relation to creativity and practically useful development.
Inequality, and its associated establishment of narrow and arbitrary categories, upon which hierarchies of value are defined, is a process which opposes creative development, and supports oppression defined within narrow categories [held in place by both proclamation (of law) and violence].
The basic sets of (fundamental) opposites which need to be considered (in the dialectic of Hegelian opposites of the propaganda system) are:
1. Equality (broad range of both knowledge and creativity)
vs.
2. Inequality (limited creativity).
I. Stability and consistency of a precise (math) language structure
(measuring associated to building)
vs.
II. quantization and data-fitting, built on unstable and inconsistent set structures
(measuring associated to data-fitting for contexts controlled by propaganda).
A. Geometry
vs.
B. Probability (or randomness, and function spaces)
Generalizing
The tendency of math to apply a single pattern to many contexts
Math wants to identify the most general (ie a wide class of patterns organized around one idea or one interpretation) and abstract context (ie mathematical abstraction means that it is not clear "to what" the pattern applies) within which quantitative (or measurable) and geometric patterns can be defined (or exist).
[the idea of generalizing is derived from classical physics, where the differential equation, F=ma, applies to a wide range of classical systems, where F is a 2-form, defined in the absolute space (identified by m {or charge}, where F=ma is unaffected by m's relative motion), where m's spatial displacements are measured as a second-order time derivative, ie consistent with the order of the 2-form, and there is continuity concerning mass, energy, momentum, and the dimension of the absolute space defined by m]
Though F=ma can be applied over a wide range of different systems, its descriptive construct is actually narrow and fairly specific (materialism, an absolute 3-dimensional Euclidean space, interactions assumed to be spherically symmetric, while only linear and separable differential equations were solvable).
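The narrow but solvable classical case can be made concrete with the one-dimensional harmonic oscillator, a linear, separable instance of F=ma (this worked example is my illustration, not drawn from the text):

```latex
% F = ma for a linear restoring force F = -kx (harmonic oscillator):
m\,\ddot{x} = -k\,x
\;\Longrightarrow\;
x(t) = A\cos(\omega t + \varphi),
\qquad \omega = \sqrt{k/m}
```

Here A and φ are fixed by the initial conditions; the existence of this closed-form solution is what makes the system predictable and controllable, in contrast to the generic nonlinear case.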
In mathematics generalization has come to be about:
1. containment sets,
2. measured properties (coordinates and functions),
3. operators,
4. inverses,
5. continuity (or nearness),
5 (a). images and preimages,
6. holes in (domain) spaces,
7. functions and function spaces, and
8. (indefinable) randomness.
But this attempt at generalizing does not take into account the very important math notions of:
1. quantitative consistency,
2. size of sets,
3. geometric stability (or limitations of quantitatively consistent solvability), and
4. valid descriptions of randomness requiring: finiteness, and set stability, and well defined set constructs (so that counting makes sense).
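Item 4 above, that a valid description of randomness requires a finite, stable, well-defined set construct "so that counting makes sense," has a minimal sketch; the two-dice example is my illustration, not the author's:

```python
from fractions import Fraction
from itertools import product

# A well-defined probability is a ratio of counts over a stable, finite
# event space: here, the 36 equally likely outcomes of rolling two dice.
outcomes = list(product(range(1, 7), repeat=2))
favorable = [o for o in outcomes if sum(o) == 7]
p = Fraction(len(favorable), len(outcomes))
print(p)  # → 1/6
```

When the event space is not finite, stable, and well defined, this counting construct, and with it the probability, loses its meaning; that is the contrast being drawn with "indefinable" randomness.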
That is, this focus on generalization and abstraction leads to a quantitative and operational basis for precise descriptions of an illusory world (consistent with the general math constructs), which is nonetheless measurably verified, but which lacks any relation to new practical creative development, ie this is the proof that the description is a description of math (or measurable) patterns "which exist in a world of illusion."
The context of classical physics is geometry and continuity, where systems can be linear or nonlinear,
while
The context of quantum physics is indefinable randomness; and if a quantum system cannot be diagonalized, as the vast majority of quantum systems cannot be, then the non-commutative uncertainty principle implies the system is nonlinear; but furthermore, material interactions in quantum physics are modeled as nonlinear particle-collisions, ie quantum physics is indefinably random and nonlinear.
Thus quantum descriptions have very little relation to practical creative development.
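The non-commutativity invoked above can be shown with the smallest standard example: the Pauli spin matrices (a textbook choice I am supplying as an illustration, not one named in the text) fail to commute, so no change of basis diagonalizes both at once.

```python
def matmul(a, b):
    """2x2 complex matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def commutator(a, b):
    """[a, b] = ab - ba; nonzero exactly when a and b do not commute."""
    ab, ba = matmul(a, b), matmul(b, a)
    return [[ab[i][j] - ba[i][j] for j in range(2)] for i in range(2)]

# Pauli spin matrices, a standard pair of non-commuting quantum operators.
sx = [[0, 1], [1, 0]]
sy = [[0, -1j], [1j, 0]]
sz = [[1, 0], [0, -1]]

# [sx, sy] = 2i*sz != 0, so sx and sy cannot be simultaneously diagonalized.
c = commutator(sx, sy)
print(all(c[i][j] == 2j * sz[i][j] for i in range(2) for j in range(2)))  # → True
```

The nonzero commutator is the algebraic fact behind the uncertainty principle mentioned above.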

