Predixion Software released Predixion Enterprise Insight 3.0, which lets users create, reuse, share, and deploy predictive applications. The latest version introduces a new machine-learning semantic model that makes it possible for non-technical users to take advantage of predictive analytics on their data.
Data on the web has always been valuable. From blogs to websites to social media, data has enabled companies to make money. Who we are, what we like, what we buy, and what we want to buy are all valuable information. Unfortunately, things are changing now.
Algebraix Data Corporation announced that its SPARQL Server RDF database successfully executed all 17 queries of the SP2 benchmark at scales up to one billion triples on a single computer node. The company says this computationally complex test exceeds all previous results.
ProQuest has entered into an expanded agreement with publisher Schattauer GmbH to add to the German-language resources discoverable through the Summon service.
The Call Center Corporation has partnered with Coginov to present the first in a series of free educational webinars. “Big Data Solutions that have Created Useable Technologies” is the first topic and will be presented by executives of both companies on Wednesday, April 24th at 11:00 a.m. MST.
Military intelligence is teaming up with ontology scientists and experts to understand the challenges of, and solutions for, storing big data. An April 18 workshop at the University at Buffalo (UB) will explore this big data conundrum, as well as related topics.
A group of enterprise architecture experts recently gathered to discuss big data, cloud computing and storage, and how these work best with overall information architecture. This interesting topic was brought to our attention by ZDNet in its article, “Complexity from big data and cloud trends makes architecture tools more powerful.” Simultaneous and complex trends, such […]
Human beings supposedly produce 2.5 quintillion bytes of data every day, according to IBM. Adding to that bit of trivia, 90 percent of the data in the world today was created in the past two years. The result is today’s challenge of “big data”: enormous data sets that most relational database management systems (Oracle, IBM, Microsoft, etc.) find difficult to process. “Ontologies for Information Integration” (OI2), an April 18 workshop at the University at Buffalo, will address this challenge and offer some non-traditional solutions.
MzTEK has coordinated an art commission titled “Data as Culture.” MzTEK’s goal is to provide a learning community in technology and arts for women. The aim of this particular commission was to highlight the use of data in an artistic context and to challenge perceptions of what defines data.
Everyone talks about data – the growth of data, the importance of data, and so on. So how does data integrate into a business and actually become a product? The process is referred to as “informationalization.” The premise is simple: make existing products and services increasingly important and valuable to customers by building in more data.