Database Management

Relational Database Management Systems (RDBMSs) continue to do the heavy lifting in data management, while newer database management systems are taking on prominent roles as well. NoSQL database systems such as key-value, column-family, graph, and document databases are gaining acceptance due to their ability to handle unstructured and semi-structured data. MultiValue, sometimes called the fifth NoSQL database, is also a well-established database management technology that continues to evolve to address new enterprise requirements.



Database Management Articles

EnterpriseDB, developer of enterprise PostgreSQL and Oracle compatibility products and services, has released Postgres Plus Advanced Server 9.1, the latest version of its Advanced Server platform. The update offers improvements in read performance and write scalability, as well as greater flexibility and reliability through transaction-level control over synchronous replication, which the company calls an industry first. Postgres Plus Advanced Server 9.1 also adds Virtual Private Database for stronger security and expands its set of Oracle-compatible features.

Posted February 21, 2012

Objectivity, Inc. has announced the availability of a new version of its graph database, InfiniteGraph 2.1. The release includes three key new features: a new Plug-in Framework, an integrated Visualizer, and support for Tinkerpop Blueprints. The new features are designed to help application developers get up and running faster, make queries reusable, and receive query results interactively.

Posted February 21, 2012

The Oracle Applications Users Group (OAUG) has announced new programs in the 2012 OAUG Educational Series, a virtual learning series of webinars offered exclusively to OAUG members. "OAUG members consistently report that education is one of the primary reasons for being a member," said Mark Clark, president of the OAUG. "The OAUG Educational Series is a prime example of how our organization continually serves its members by offering informative training events throughout the year to help them reap the most benefit from their Oracle investment."

Posted February 21, 2012

Composite Software has introduced version 6.1 of its Composite Data Virtualization Platform. The new release offers improved caching performance, expanded caching targets, data ship join for Teradata, and Hadoop MapReduce connectivity. Composite 6.1 also provides improvements to the data services development environment with an enhanced data services editor and new publishing options for Representational State Transfer (REST) and Open Data Protocol (OData) data services.

Posted February 17, 2012

At the TDWI 2012 Conference, Jaspersoft, a business intelligence (BI) software provider, announced an upgraded OEM agreement with Talend to include native connectors to Apache Hadoop big data environments in Jaspersoft ETL. With this enhanced ETL offering, Jaspersoft offers CIOs, data scientists, and BI builders the flexibility of three options to harness big data - direct reporting, direct real-time analysis, and batch analysis through ETL data mart access. "Today's data scientists want options to explore data faster," says Karl Van den Bergh, Jaspersoft's vice president of Product and Alliances.

Posted February 16, 2012

Oracle has introduced MySQL Cluster 7.2, which is designed to cost-effectively deliver high availability, high write scalability and low latency for demanding web-based and communications products and services. "The performance and flexibility enhancements in MySQL Cluster 7.2 provide users with a solid foundation for their mission-critical web workloads, blending the best of SQL and NoSQL technologies to reduce risk, cost and complexity," says Tomas Ulin, vice president of MySQL Engineering, Oracle.

Posted February 15, 2012

Oracle has announced the availability of Oracle Advanced Analytics, a new option for Oracle Database 11g that combines Oracle R Enterprise with Oracle Data Mining. According to Oracle, Oracle R Enterprise delivers enterprise class performance for users of the R statistical programming language, increasing the scale of data that can be analyzed by orders of magnitude using Oracle Database 11g.

Posted February 15, 2012

Lectures related to master data bring forth all sorts of taxonomies intended to clarify master data and its place within an organization. Sliding scales may be presented: at the top, not master data; at the bottom, very much master data; in the middle, increasing degrees of "master data-ness." For the longest time, everyone thought metadata was confusing enough ... oops, we've done it again. And we have established this master data semantic monster in quite grand fashion.

Posted February 15, 2012

Today's organizations must capture, track, analyze and store more information than ever before - everything from mass quantities of transactional, online and mobile data, to growing amounts of "machine-generated data" such as call detail records, gaming data or sensor readings. And just as volumes are expanding into the tens of terabytes, and even the petabyte range and beyond, IT departments are facing increasing demands for real-time analytics. In this era of "big data," the challenges are as varied as the solutions available to address them. How can businesses store all their data? How can they mitigate the impact of data overload on application performance, speed and reliability? How can they manage and analyze large data sets both efficiently and cost effectively?

Posted February 09, 2012

Many types of data change over time, and different users and applications need to access that data as of different points in time. A traditional DBMS stores data that is implied to be valid at the current point in time; it does not track past or future states of the data. For some users the current, up-to-date values are sufficient, but others need access to earlier versions of the data. Temporal support makes it possible to store different database states and to query the data "as of" those different states.
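In a relational implementation, this typically means each row version carries a validity interval, and an "as of" query selects the version whose interval covers the requested point in time. Below is a minimal sketch of that pattern using Python's built-in sqlite3 module; the table, columns, and dates are hypothetical illustrations, and native temporal support in a DBMS (for example, the period tables standardized in SQL:2011) goes well beyond this.

```python
# Minimal "as of" temporal query sketch using Python's built-in sqlite3.
# The product_price table and its data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE product_price (
        product_id  INTEGER,
        price       REAL,
        valid_from  TEXT,   -- start of this version's validity
        valid_to    TEXT    -- end of validity; '9999-12-31' marks the current row
    )
""")
# Two historical versions and the current version of one product's price.
conn.executemany(
    "INSERT INTO product_price VALUES (?, ?, ?, ?)",
    [
        (1, 9.99,  "2011-01-01", "2011-07-01"),
        (1, 10.99, "2011-07-01", "2012-01-01"),
        (1, 11.49, "2012-01-01", "9999-12-31"),
    ],
)

def price_as_of(product_id, date):
    """Return the price that was valid on the given ISO-format date."""
    row = conn.execute(
        """SELECT price FROM product_price
           WHERE product_id = ? AND valid_from <= ? AND ? < valid_to""",
        (product_id, date, date),
    ).fetchone()
    return row[0] if row else None

print(price_as_of(1, "2011-03-15"))  # 9.99  -- the state "as of" March 2011
print(price_as_of(1, "2012-02-01"))  # 11.49 -- the current state
```

The half-open interval (valid_from inclusive, valid_to exclusive) is the design choice doing the work here: consecutive versions can share a boundary date without ever matching the same "as of" query twice.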

Posted February 09, 2012

InterSystems Corporation, a provider of advanced database, integration, and analytics technologies, has announced that it has achieved ISO 9001:2008 certification. ISO 9001:2008 is a quality management standard; for InterSystems, the certification covers all product and service creation processes associated with the InterSystems CACHÉ high-performance database and the InterSystems Ensemble integration and development platform that are performed or managed from the company's Cambridge headquarters.

Posted February 09, 2012

Ntirety, Inc. announced that it has been successfully audited and certified under the MSPAlliance's (MSPA) Unified Certification Standard for Cloud and Managed Service Providers (UCS). The certification is specifically designed to provide business consumers of cloud and managed services with the assurance that the service provider they hire will meet or exceed the highest principles of quality in areas such as financial stability, facilities, managed services practices, and customer satisfaction.

Posted February 07, 2012

KXEN, a provider of predictive analytics for business users, has certified its flagship product, InfiniteInsight, for Sybase IQ 15, the column-based analytics server. The combination of the two solutions allows businesses to build predictive models and perform social network analysis with greater speed and performance to support business decisions.

Posted January 25, 2012

Star Analytics, Inc., a provider of application process automation and integration software, has released the latest version of its data bridging technology, Star Integration Server, which is intended to make it easy to extract and combine data from Oracle systems with other business intelligence (BI) applications and data warehouses.

Posted January 24, 2012

HiT Software, Inc. has announced a new release of its JDBC/DB2 type 4 SQL middleware, which conforms to the Java JDBC 4.1 specification. With this latest release, application developers can take advantage of added support for additional data types, improved security mechanisms and support for IBM DB2 on a wide range of systems and platforms.

Posted January 23, 2012

"Big data" and analytics have become the rage within the executive suite. The promise is immense - harness all the available information within the enterprise, regardless of data model or source, and mine it for insights that can't be seen any other way. In short, senior managers become more effective at business planning, spotting emerging trends and opportunities and anticipating crises because they have the means to see both the metaphorical trees and the forest at the same time. However, big data technologies don't come without a cost.

Posted January 11, 2012

Let's tie together the last several columns on "2012 Might Really be The End of the World." In this series, I discussed several megatrends in the general IT industry that will have a tremendous impact on the database administration (DBA) profession. The megatrends include both software-related (virtualization and cheap cloud database services) and hardware-related (SSDs and massively multi-core CPUs). These technologies have the potential to obviate many of the core competencies of the DBA, with the first two eliminating or lessening the need for server and hardware configuration and provisioning, and the last two diminishing the need for IO tuning and query tuning, respectively. But those are trends that will take years to reach fruition. What about the near future?

Posted January 11, 2012

Along with thousands of IT professionals, I was in the main hall of San Francisco's Moscone Center last October listening to Larry Ellison's 2011 Oracle OpenWorld keynote. Larry can always be relied upon to give an entertaining presentation, a unique blend of technology insights and amusingly disparaging remarks about competitors.

Posted January 11, 2012

Retaining the particulars of change over time requires a fairly intricate configuration. Audit logs or shadow tables are sometimes employed, but on occasion the "old" and "new" rows need to exist in a single operational table for application use. Far too often, the implementation of temporal data structures is shoddy, loose, and imprecise rather than the fairly complex dance such temporal arrangements must actually perform. The sub-optimal result is much like one's performance of the Funky Chicken at a friend's wedding: the desired moves are mimicked, after a fashion, but are unlikely to earn high marks on "So You Think You Can Dance." The usual temporal implementation simply slaps on start and stop dates, debates a little over default date values versus NULLs, then moves on to the next subject.
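To make that dance concrete, here is a hedged sketch, again in Python with sqlite3 and a hypothetical table, of the step a careless implementation skips: closing the current row version and opening its successor in a single transaction, so the validity intervals neither gap nor overlap.

```python
# Sketch of a careful temporal update: expire the current row version and
# insert its successor atomically. Table and data are hypothetical.
import sqlite3

HIGH_DATE = "9999-12-31"  # sentinel for "still current" (the default-date-vs-NULL debate)

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE customer_address (
    customer_id INTEGER, address TEXT, valid_from TEXT, valid_to TEXT)""")
conn.execute(
    "INSERT INTO customer_address VALUES (1, '12 Elm St', '2010-05-01', ?)",
    (HIGH_DATE,),
)

def update_address(customer_id, new_address, effective_date):
    """Close the current version and open the new one in one transaction."""
    with conn:  # both statements commit together, or neither does
        conn.execute(
            """UPDATE customer_address SET valid_to = ?
               WHERE customer_id = ? AND valid_to = ?""",
            (effective_date, customer_id, HIGH_DATE),
        )
        conn.execute(
            "INSERT INTO customer_address VALUES (?, ?, ?, ?)",
            (customer_id, new_address, effective_date, HIGH_DATE),
        )

update_address(1, "98 Oak Ave", "2011-11-15")
for row in conn.execute("SELECT * FROM customer_address ORDER BY valid_from"):
    print(row)  # the old row now ends exactly where the new row begins
```

A production design would also need a constraint preventing overlapping intervals per customer and a policy for backdated corrections; those are the moves that separate the ballroom version from the Funky Chicken.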

Posted January 11, 2012

At the outset of each new year, I devote an edition of my column to reviewing the significant data and database-related events of the previous year. Of course, to meet my deadlines, the column is written before the year is over (this column is being written in November 2011), so please excuse any significant news that may have happened late in December.

Posted January 11, 2012

The latest version of expressor software's flagship data integration platform, expressor 3.5, features cloud integration with Melissa Data's Data Quality Tools and Salesforce.com to provide comprehensive BI reporting and CRM integration with on-premises applications. The new Salesforce.com and Melissa Data capabilities ship with the expressor 3.5 Desktop Edition and Standard Edition.

Posted January 10, 2012

ScaleBase, Inc. has announced the results of its database benchmark test: 180,000 transactions per minute while running on an Amazon RDS environment, which the company says is the highest result for a MySQL database. According to the company, the ScaleBase Load Balancer solution proved how well it can scale MySQL by running the DBT-2 benchmark, which is similar to the standard TPC-C benchmark, on the Amazon EC2 platform with the Amazon RDS database.

Posted January 10, 2012

Oracle has announced the availability of Oracle Solaris Studio 12.3, a C, C++, and Fortran development platform for building fast, scalable enterprise applications for Oracle Solaris systems. According to Oracle, the new release accelerates performance of SPARC T4 and x86-based applications up to 300% by leveraging Oracle's advanced compiler technology.

Posted January 04, 2012

BNP Paribas has implemented Oracle Exadata Database Machine to manage electronic trading floor data. BNP Paribas' data warehouse manages billions of messages in real time, processing a terabyte of raw data daily. A half-rack Oracle Exadata Database Machine has helped BNP Paribas better manage data growth and improve system performance.

Posted January 04, 2012

Sybase has launched the "Mobility Manifesto" site to support and encourage enterprise workers' access to the devices and applications they want to use. The Mobility Manifesto lets enterprise workers take a quiz to find out where their company ranks on mobility, share the results with their boss through an auto-filled letter that sets the tone for change in the workplace, and download an iBook with guidance and advice for both end users and IT.

Posted December 21, 2011

SAP AG has announced that customer and partner demand for SAP HANA has surged in the year since the technology was introduced. According to Dr. Vishal Sikka, member of the SAP executive board, Technology & Innovation, leading independent software vendors are adopting the open SAP HANA platform for their existing products and are also building completely new applications on it. At the recent SAP Influencer Summit 2011 in Boston, the company also announced that SAP HANA is at the core of its platform roadmap, powering renewed applications without disruption as well as entirely new ones.

Posted December 21, 2011

The first calendar year following SAP's acquisition of Sybase is coming to a close. David Jonker, director, product marketing - Data Management & Analytics, Sybase, discusses key product integrations, IT trends that loom large in Sybase's data management strategies, and the emergence of what Sybase describes as DW 2.0. 2011 has been "a foundational year," with effort focused on making Sybase technologies work with SAP and setting the stage for 2012, says Jonker. "We believe 2012 is going to be a big year for us on the database side."

Posted December 21, 2011

LexisNexis, a pioneer of information technology, has selected big data specialist MarkLogic to power components of the new platform behind Lexis Advance, its legal research solution.

Posted December 08, 2011

EMC Corporation has introduced the EMC Greenplum Unified Analytics Platform (UAP), a big data analytics platform that combines the co-processing of structured and unstructured data with a productivity engine enabling collaboration among data scientists. The new EMC Greenplum UAP brings together the EMC Greenplum database for structured data; EMC Greenplum HD, the enterprise Hadoop offering for the analysis and processing of unstructured data; and EMC Greenplum Chorus, its new productivity engine for data science teams. Greenplum UAP will be available in the first quarter of calendar 2012.

Posted December 08, 2011

A new blog post on the SHARE website places the focus on the mainframe as a big data workhorse and the reigning alternative to internal (or external) cloud provision. Pedro Pereira, writing in SHARE's "President's Corner," makes several astute observations, including identifying security and availability as unknowns in a cloud environment.

Posted December 06, 2011

Join Oracle and Unisphere for a live webcast to learn more about the common practices that are most vulnerable to fraud and error, and the best practices and technologies used by leading versus laggard organizations to drive the hidden costs out of operations and enforce process controls. Speakers will include Thomas J. Wilson, president, Unisphere Research; Joseph McKendrick, analyst, Unisphere Research; and Stephanie Maziol, director, GRC Applications, Oracle.

Posted December 06, 2011

Three columns ago, I started a series of articles pointing out that tough times are a-comin' for the DBA profession due to major disruptive changes in the wider IT world (see "2012 Might Really Be the End of the World as We Know It"). In previous columns, I have described how our lives will change due to major technological shifts such as solid state disks (SSDs) and massively multicore CPUs.

Posted December 06, 2011

In a world replete with regulations and threats, organizations today have to go well beyond just securing their data. Protecting this most valuable asset means that companies have to perpetually monitor their systems in order to know who did exactly what, when and how - to their data.

Posted December 01, 2011

The cost for new development can often be easily justified. If a new function is needed, staffing a team to create such functionality and supporting data structures can be quantified and voted up or down by those controlling resources. Money can be found to build those things that move the organization forward; often, the expense may be covered by savings or increased revenue derived from providing the new services.

Posted December 01, 2011

expressor software, a provider of data integration software, has launched a product initiative aimed at simplifying the development of expressor and Teradata Express analytical database applications based on a diverse set of operational data sources. One of the biggest challenges people face is figuring out how to get data loaded into their database so they can begin using it for analysis and development without resorting to traditional ETL solutions, which can be very costly, Hugo Sheng, director of field engineering at expressor, tells 5 Minute Briefing.

Posted November 22, 2011

Being a successful database administrator requires far more than technical acumen and database knowledge. DBAs should be armed with a proper attitude as well as sufficient fortitude and personality before attempting to practice database administration. Gaining the technical know-how is important, yes, but there are many sources that offer technical guidance for DBAs. The non-technical aspects of the job are just as challenging, though. So with that in mind, this month's column offers 10 "rules of thumb" for DBAs to follow as they improve their soft skills.

Posted November 22, 2011

UC4, an IT automation software vendor, announced a partnership with Basis Technologies International (BTI), a provider of add-on solutions that optimize SAP, intended to deliver targeted automation solutions for SAP customers. The joint offering, dubbed "UC4 MDR powered by BTI," will integrate BTI's Mass Data Runtime - an SAP NetWeaver-based solution that delivers data processing improvements - with UC4's ONE Automation platform. The combination of the two products is designed to help accelerate and orchestrate business processes, applications, and infrastructure within SAP environments.

Posted November 16, 2011

Sybase, an SAP company, has unveiled a new guide on big data analytics titled "Intelligence for Everyone: Transforming Business Analytics Across the Enterprise." The objective of the guide is to demonstrate through facts and examples that there are tools and methods to make sense of new and massive data sets, service all users at all levels, and analyze many different data types to provide actionable answers.

Posted November 16, 2011

Oracle has introduced PeopleSoft HCM 9.1 Feature Pack 2, which provides a consumer-like self-service user experience to improve the way employees and managers perform their day-to-day activities. Oracle's Feature Packs enable a quicker response to PeopleSoft customer requests, while letting customers choose how and when to deploy the new functionality to their users.

Posted November 16, 2011

Oracle has announced the availability of Oracle Solaris 11, which the company says can meet the security, performance and scalability requirements of cloud-based deployments and enable customers to run their most demanding enterprise applications in private, hybrid, or public clouds.

Posted November 16, 2011

SAP AG has announced the availability of SAP NetWeaver Business Warehouse (SAP NetWeaver BW) 7.3 running on the SAP HANA platform. According to SAP, SAP HANA can enhance query performance and provide faster data loads in SAP NetWeaver BW, and can help customers significantly reduce total cost of ownership by simplifying administration through reduced data layers. The announcement was made at the SAPPHIRE NOW and SAP TechEd co-located event in Madrid.

Posted November 16, 2011

Sybase has unveiled a new version of the Sybase IQ high performance column-based analytics database, due to be generally available by the end of November. "This really is an extension into big data - and big data is characterized by a lot of things - but we see the trends in the market around MapReduce and Hadoop in database analytics and we have added those capabilities into IQ 15.4," Dan Lahl, director of product marketing at Sybase, tells 5 Minute Briefing. With the new release of IQ, he notes, Sybase IQ provides customers a "have it your way" approach.

Posted November 16, 2011
